Software, including open source, is becoming regulated the world over. This lengthy blog post explains the background to the Cyber Resilience Act in the European Union, what is good about it, its flaws, and its likely negative impact on open source. It also explains the arcane process by which the act moves through the EU system, to help readers understand the timeline and how to effect change.
If you are looking for a more spoken introduction, Mike Milinkovich of the Eclipse Foundation gave a very up-to-date and lucid presentation that covers the same ground. If you are more into short calls to action, try GitHub, CNLL (in French), the Linux Foundation, or the more comprehensive response of the wider industry.
GitHub: https://github.blog/2023-07-12-no-cyber-resilience-without-open-source-sustainability/
Linux Foundation: https://linuxfoundation.eu/cyber-resilience-act
Industry response: https://ccianet.org/library/joint-recommendations-for-a-feasible-cyber-resilience-act/
Although the IT industry is still small compared to other large industries and sectors, over the past decades it has become crucial to society. It is now common to see large events in the software and IT industry in the news. And, more often than not, it’s a story triggered by some sort of disaster: a misconfiguration, a bug, or criminals and state actors that “got in” apparently far too easily. Poor IT practices now also affect the major industries, from energy and transport to manufacturing, finance, democratic processes, and good government.
Because of this, societies and various governing bodies have certainly taken notice and, as a result, around the world, all sorts of software regulation and legislation are being prepared.
Using engineering history as an example, such regulation is a perfectly normal development. In the late 1800s, the mechanical industry saw incredible growth thanks, in part, to the invention of the steam engine. But as the industry grew, so did the number of accidents with exploding steam boilers, which often flattened half of a given town.
After the explosion of the steamship Sultana in 1865, which killed 1,167 people, pressure was placed on the industry in the United States. This resulted in the creation of the American Boiler Manufacturers Association (ABMA) to start self-regulation of the industry. It took several hundred such explosions, and a particularly costly one in 1905 at a Boston shoe factory, before government policies intervened.
Interestingly, it was not the ABMA that responded to the 1905 disaster, but a group of five engineers, members of the American Society of Mechanical Engineers (ASME), a professional organization of individuals rather than companies. These people wrote the first version of the Boiler Code, which the Massachusetts legislature endorsed shortly thereafter.
In many ways, these engineers, these individual volunteers, “scratched an itch” to solve the problem, much as we do today in open source at the ASF and, for example, at the Internet Engineering Task Force (which sets the standards for the Internet). It was the professional community that solved the problem: not their employers, the industry, or the ABMA.
There is currently a lot of legislation in process in almost all parts of the world; with the US and the European Union slightly ahead (and with plenty of coordination between the policy makers of the various countries).
In this blog post we’ll focus on just one for now: the Cyber Resilience Act (CRA) in the EU, as that is “first” from a timeline perspective.
It is by no means the most important piece of legislation. At the ASF we gauge the impact of the EU’s Product Liability Directive (which introduces strict liability for software), US Executive Order 14028 (“Improving the Nation’s Cybersecurity”), and the US “Securing Open Source Software Act of 2023” as perhaps even larger.
US Executive Order 14028, “Improving the Nation’s Cybersecurity”:
https://www.whitehouse.gov/briefing-room/presidential-actions/2021/05/12/executive-order-on-improving-the-nations-cybersecurity/
“Securing Open Source Software Act of 2023” (US):
https://www.congress.gov/bill/118th-congress/senate-bill/917/text
This may be especially true as the US legislation could set standards for that nation through the National Institute of Standards and Technology (NIST), which typically moves faster than EU standards development (and thus may well end up setting the de facto global standard).
In day-to-day practice, software developers rarely need to consider regulation (unless you work in a specific field, say medical, aerospace, finance, or nuclear). Open source licenses (on our downstream outflow) and contributor license agreements (on our upstream inflow) tend to have far-reaching disclaimers. And we often equate code with codified knowledge or speech.
However, in actual practice, things are not that simple. For example, here at the ASF we have, over the years, needed to file paperwork letting the Bureau of Industry and Security (BIS) in the United States know the exact location of cryptographic code that we make available for download [https://infra.apache.org/crypto.html]. And code distributed by the ASF cannot be exported (or re-exported) to certain destinations or to people on a certain list.
| Translator’s note: “certain destinations or people on a certain list” — see the guidelines for exporting ASF products: https://www.apache.org/licenses/exports/
In the EU the Cyber Resilience Act (CRA) is now making its way through the law-making processes (and due for a key vote on July 19, 2023). This act will apply to a wide swath of software (and hardware with embedded software) in the EU. The intent of this regulation is good (and arguably long overdue): to make software much more secure.
| Translator’s note: the EU Cyber Resilience Act: https://www.european-cyber-resilience-act.com/. It has since been voted through, but it has drawn many objections, and subsequent developments are worth watching.
The act attempts to do this in a number of ways. The most important is that the CRA will require the market to apply industry good practices to security when designing, building, releasing, and maintaining software. At the most basic level, the CRA formalizes what is by and large already policy at the ASF: manage your bugs and accept, triage, and fix security vulnerabilities. It pairs this with good governance practices, such as registering CVEs when appropriate, writing release notes, and decent versioning (and, in fairness, some of those we should further formalize and improve).
| Translator’s note: the ASF’s current baseline policy:
https://www.apache.org/security/committers.html
The CRA will also attempt to ensure that any and all software in the European market meets some sort of minimum level of security by fairly simple self-certification documented in a CE conformity declaration. Or, for software that is more critical, such as a firewall or a secure cryptographic key enclave, an actual “real” certification and audit by an external, regulated, and notified body. The CRA will also define a number of processes to monitor compliance in the market.
| Translator’s note: the “CE” mark is a product safety certification mark (covering only the essential requirement that a product not endanger humans, animals, or goods, rather than general quality requirements) and is regarded as a manufacturer’s passport into the European market. CE stands for Conformité Européenne.
| Translator’s note: a secure cryptographic key enclave is a protected portion of a hardware processor and its memory.
EU policymakers recognize that these “industry best practices” are not yet well defined (within the industry in general, the ASF is the exception, not the rule) — and a lot of the CRA relies on international standards organizations to create the standards one can use to audit one’s project (self-certification) or that can be used by external auditors.
There is also an expectation that significant vulnerabilities will get special treatment, and that these will be reported early. More on that later.
If you’ve followed the various blogs and letters, open source foundations have focused heavily on refining the current wording of the CRA to make open source software “exempt”; i.e., to have the CRA apply only once the code leaves the open source commons, and then continue to apply throughout the entire commercial supply chain; and to stop the CRA from applying when something, e.g. a security fix, comes back and enters the commons again.
Blog: https://blog.opensource.org/what-is-the-cyber-resilience-act-and-why-its-important-for-open-source/
Open letters: https://blog.opensource.org/the-ultimate-list-of-reactions-to-the-cyber-resilience-act/
By and large, these efforts have not been successful. Successive versions of the documents changed considerably – but not around this specific policy issue.
To understand why, representatives of the ASF (together with OpenSSL), spoke directly to the EU on July 7, the first time we actually were able to interact with lawmakers in a meaningful way.
From this conversation, we learned that the policy makers are very aware that open source is crucial to the IT industry — both for “production” and innovation. And, because of this, they want to avoid killing the goose that lays the golden eggs.
| Translator’s note: “policymakers are very aware” — see the CRA impact assessment:
https://digital-strategy.ec.europa.eu/en/library/cyber-resilience-act-impact-assessment
On the other hand, EU lawmakers also realize that open source is often 95% or more of the software stack on which a typical European Small and Medium Enterprise (SME) operates or is licensed. And it is that entire stack which the SME, as the party that places it on the market, is liable for.
From what we understand, policymakers assume that these process improvements (and (self-)certification) are costly, on the order of 25% extra cost overhead. This is based on recently introduced similar regulations in the medical sector and on the CRA impact assessment (any proposed EU law needs to have its likely economic impact documented).
So looking at the whole stack of an SME (i.e., 95% open source, 5% secret sauce), for most European SMEs this extra effort over the full 100% would amount to several times their engineering effort and hence would not be feasible. Whereas, the EU’s thinking goes, certifying the 5 or 10% of the code they build on top of the open source stack is a lot more achievable.
So, for this reason, the policymakers (1) have made it crystal clear to the ASF that they intend to have the CRA apply to open source foundations. The current exceptions for open source cover only pure hobbyists, code that is not used in real life, or things such as mirrors and package repositories like npm or Maven Central. The way they do this is by presuming commercial intent whenever the software is used anywhere in a commercial environment.
A piece of EU legislation is generally drafted by the European Commission (who also prepares things such as impact studies). It is then discussed in Parliament. This is generally done in smaller committees. These committees prepare reports and ultimately legislation then goes to a plenary session of the parliament for voting(2).
For the CRA the main committees are LIBE, IMCO, and ITRE.
The first, LIBE (Committee on Civil Liberties, Justice and Home Affairs), where things such as ‘free speech’ are discussed, declined to produce a report. Next, IMCO, the Committee on the Internal Market and Consumer Protection, looked at what is important for consumers and the internal market. It produced a report that was fed into ITRE.
ITRE, the Committee on Industry, Research and Energy, has since produced a consensus document that is expected to be discussed publicly in the week of July 17, 2023 and to receive its final committee endorsement (committees generally do not vote on matters where there is consent).
| Translator’s note: public discussion in the week of July 17, 2023:
https://www.europarl.europa.eu/committees/en/itre/home/highlights
Once this completes, the proposal goes through the European Parliament for voting. Depending on how controversial or consensual it is at that time, there may, or may not, be discussion and a free vote.
| Translator’s note: once this work completes, see: https://www.europarl.europa.eu/legislative-train/theme-a-europe-fit-for-the-digital-age/file-european-cyber-resilience-act
In the meantime, the third body in the EU lawmaking process, the Council, also prepares its own version of the Act. The Council essentially comprises the relevant ministers of each member state, who look at the Act from a national perspective. The three versions (Commission, Parliament, and Council) are then discussed behind closed doors in the Trialogues, which yield the final version that becomes law.
Right now all parties in the lawmaking process are said to have reached a rough consensus – and two of them shared with the ASF their opinion that there is no controversy. Also, copies of the various consensus documents have leaked – so we know that they are not far apart, and we can now also start to analyze them.
The current definitions(3) are such that the CRA applies to the ASF, all of its (volunteer) developers, and all our output. And, as the ASF understands from its meeting with policy makers, this was intentional.
There are quite a few concerns with the CRA, but the following are probably the top ones for the ASF community.
No concept of a commons distinct from the commercial market; an all-in approach: the first issue is that the CRA takes a binary, all-or-nothing approach. You are either in or you are out. And when you are in, what is applied to you is, essentially, what would be applied to a full-blown commercial product sold to consumers.
While open source can come close to that (e.g., Apache NetBeans or Apache Zeppelin, albeit not sold), open source generally is not part of that commercial setting. Instead, it may be managed as a piece of shared knowledge, a ‘commons’, much like, for example, academic papers or reference blueprints. The CRA does not acknowledge this, and hence applies ‘in full’ (as opposed to, for example, applying only the elements of the CRA that could make sense in that context, such as good vulnerability handling, versioning, and SBOMs).
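Of those elements, the SBOM is the most concrete artifact. As an illustrative sketch only (the CRA text does not mandate a specific format; CycloneDX and SPDX are the common choices, and real SBOMs are produced by build tooling rather than written by hand), a minimal CycloneDX-style SBOM for a product depending on one Apache library might look like this:

```python
import json

# Minimal CycloneDX-style SBOM (illustrative sketch, not a complete document).
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "commons-text",  # example dependency for illustration
            "version": "1.10.0",
            # package URL (purl) identifying the exact artifact
            "purl": "pkg:maven/org.apache.commons/commons-text@1.10.0",
        },
    ],
}

print(json.dumps(sbom, indent=2))
```

The point of such a document downstream is exactly the one the CRA cares about: a consumer can match the `purl` entries against vulnerability advisories without inspecting the code itself.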
The CRA would regulate open source projects unless they have “a fully decentralized development model.” However, a project where a “corporate” employee has commit rights would not be exempt (regardless, potentially, of the upstream collaboration having little or nothing to do with their employer’s commercial product). And some projects, like the venerable OpenSSL project, have an even more complex model.
| Translator’s note: the OpenSSL project’s even more complex model:
https://www.openssl.org/blog/blog/2023/07/17/who-writes-openssl/
This turns the win-win of open source on its head. If corporate maintainers are effectively banned, corporations may pull back from allowing their employees to maintain projects, harming the open source innovation ecosystem and, ironically, undermining its resilience and a significant generator of economic growth (€9bn per year, according to the EU impact assessment).
It also makes it very hard to see who in the ASF community would do the extra (self) certification work that the ASF would need to do.
The net effect of this is actually quite broad. To give an example from the Recitals(4), 10a (and there are many such examples):
Similarly, where the main contributors to free and open-source projects are developers employed by commercial entities and when such developers or the employer can exercise control as to which modifications are accepted in the code base, the project should generally be considered to be of a commercial nature.
Here the lack of a transactional connection between those contributors and their commercial employers is problematic. For example, the developer could be an airline pilot employed by a commercial airline (i.e., a commercial entity) who contributes to open source in their spare time: this part of the policy would make that contribution ‘commercial’. And at the ASF, the main contributors (committers) are of course able to exercise a level of control over what goes into a codebase(5).
| Translator’s note: in other words, even an employer airline that has nothing whatsoever to do with open source would be caught in the crossfire and fall within the scope of regulation.
And what makes matters worse is that the open source organizations most affected are exactly those that, today, tend to have very mature security processes, with vulnerabilities triaged, fixed, and disclosed responsibly, with CVEs to match. Yet it is generally further downstream, with the companies that place the product on the market, that the CRA needs to drive significant improvement. It now risks doing the reverse.
The CRA affects projects that are entirely volunteer-led and -driven (such as those at the ASF), where no one company has any influence on what the project does and releases. Any project where an employee of a commercial entity has commit rights is affected.
This creates a problem: both commercial companies and open source projects will need to be much more careful about which committers can work on code, what funding they take, and which patches they can accept.
The certification regime strongly assumes that (self-)certification of modules is ‘transitive’; i.e., that if you build something from certified modules, you only have to certify the few extra things you have done. Unfortunately, this is not true in general. Certification is very much about showing how you, as the final, liable organization, have made sure that what you delivered is fit for the purpose you delivered it for, in the specific setting at your customer. That information is not available ‘upstream’ at the open source organizations that self-certified the building blocks.
The core of certification is to ascertain that what you release is suitably secure for its intended purpose. Specifically, you have done your security by design and mapped out your threat actors, vectors, and risks. And then made reasonable engineering compromises based on risk.
Unfortunately, in open source we often have no idea how our software is going to be used. And, as we have learned (the hard way) over the past decade, it is key to the good governance of our shared commons that we do not discriminate or otherwise limit our licenses (in fact, that is part of the Open Source Definition).
| Translator’s note: the Open Source Definition is a set of ten basic principles laid down by the OSI for open source licenses. A license that violates these principles may not call itself an “open source” license. Open Source Definition: https://zh.wikipedia.org/zh-cn/%E5%BC%80%E6%BA%90%E5%AE%9A%E4%B9%89
Some of the obligations are virtually impossible to meet: for example, there is an obligation to “deliver a product without known exploitable vulnerabilities.” This is an almost impossible bar, especially as open source authors neither know, nor have control over, how their code is integrated downstream.
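Part of why that bar is so hard is that “known” is a moving target: a release-time check can only compare pinned versions against the advisories published at that moment. A minimal sketch of such a check (the advisory list and package names here are hypothetical and in-memory; real tooling queries live databases such as OSV or the NVD):

```python
# Hypothetical, in-memory advisory feed: (package, affected-version) pairs.
# Real checks query a live database, which is exactly why a release can be
# "clean" today and have a known vulnerability tomorrow.
ADVISORIES = {
    ("libfoo", "1.2.3"),  # hypothetical names and versions
    ("libbar", "0.9.0"),
}


def known_vulnerabilities(pins: dict[str, str]) -> list[str]:
    """Return the pinned packages that match a known advisory."""
    return [name for name, version in pins.items()
            if (name, version) in ADVISORIES]


release_pins = {"libfoo": "1.2.3", "libbaz": "2.0.0"}
print(known_vulnerabilities(release_pins))  # → ['libfoo']
```

A later advisory covering `libbaz` would change the answer without any change to the release, which is the gap between “no vulnerabilities known at release time” and the obligation as worded.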
The next problem is around standards. The CRA refers to a large number of ‘to be written’ international standards (generally assumed to be created at CEN-CENELEC). The IT industry in general, and open source in particular, does not have a great track record of working with these standards bodies, in part because almost all key internet standards (including at the ASF) are maintained at the IETF and W3C. In fact, it is not uncommon for the bylaws of these standards organizations not to allow open source organizations to be members in any meaningful way.
| Translator’s note: do not allow open source organizations to be members in a meaningful way: https://blog.opensource.org/another-issue-with-the-cyber-resilience-act-european-standards-bodies-are-inaccessible-to-open-source-projects/
The CRA requires the disclosure of serious unpatched and exploited vulnerabilities to ENISA (an EU institution) within a timeline measured in hours, before they are fixed. This is the opposite of industry best practice: responsible disclosure of the fix and a workaround.
And not only does this too-early reporting distract from getting a fix out; for international communities it is easy to run afoul of other countries insisting on the same information or, worse, prohibiting such sharing. This breaks the very core of the fair and equitable reporting culture that open source relies on.
And, as this information is only useful to ENISA if it is then widely shared, it is rational for organizations to choose the prudent, globally ‘fair’ option and take the easy way out: ensure they never hear about vulnerabilities in the first place. Or the opposite: simply make things public right before the (first) reporting deadline rolls over, i.e., before they are fixed.
| Translator’s note: that is, either say nothing until the problem is fixed, or publish to the whole world the moment a problem is found, rather than first notifying only certain designated bodies, such as the EU’s ENISA.
So this is yet another example where, with all its good intentions, the CRA may end up accomplishing the exact opposite.
Looking at the IT industry in Europe today, one can observe that it is generally not open source (especially not the kind coming from the likes of the ASF) that is the root cause of the sorry state of security in the IT industry. Quite the contrary.
Most SMEs in Europe, by contrast, rarely update their dependencies and are generally not well versed in dealing with security issue reports. And if (regular) updates from the ASF create even more (re)certification work for them, they may become even slower to pick up our updates and security fixes.
However, there is also a lot in the CRA that is feasible, and that we know is likely to be effective, including at the level of open source organizations such as the ASF.
In fact, we do most of this already today, such as good triage of vulnerability reports, responsible disclosure, registering CVEs, and being careful with version numbers. And to this we apply good governance, with board reporting by the projects and the occasional project moved to the Attic when its time has come.
The problem is rather that the CRA also piles on a whole range of requirements that either threaten the very fragile win-win of open source contributions and our commons, go against industry good practice, or are downright impossible; i.e., it tries to treat the open source commons identically to the commercial sector.
In fact, the USA appears to realize this and is taking the path with NIST to work with the industry to document these existing good practices.
And to some extent, the US appears to be closer to the historical engineer-led, individual-led ASME process that produced the Boiler Code, while the EU seems to be more on the path of asking manufacturers, rather than experts.
There is of course an elephant in the room: the well-oiled mechanism that “The internet treats censorship as a malfunction and routes around it” (John Perry Barlow).
| Translator’s note: “an elephant in the room” refers to a problem so large or troublesome that no one is willing to address it.
We saw that mechanism in action in the 1990s, when the USA tried to regulate cryptographic software and only “export strength” cryptography could leave the US. That led much of the cryptographic industry and its staff to leave the US, physically and legally, moving the industry from the USA to Europe. From there, companies would simply import their code back into the USA, or ship it from Europe, unencumbered by US BXA rules, to the rest of the world. It took over two decades for this to normalize (and we still have vestiges of that at the ASF).
| Translator’s note: vestiges of this can still be seen at the ASF: https://www.apache.org/licenses/exports/
So, as the ASF, we also need to factor in the risk that our communities may split on the CRA. Especially if our European communities are not able to muster enough capacity and capability to implement the CRA at the ASF.
The week of July 17, 2023 will see the ITRE vote. This is the parliamentary committee that recommends to the Members of the European Parliament how to vote. Once that is done, the Trialogues will likely start after the Summer 2023 recess. If the consensus between the three power holders holds (as it appears to for now), this process may conclude as early as December.
So, in the very short term, one can reach out to the MEPs on ITRE. It generally helps if these messages are polite, are sent by a party with some political or economic standing (e.g. a CEO or an SME organization), and are tuned to your local setting: addressed to a parliamentarian of your own country, in your own language, and mindful of the political position of the party they represent. As the regulation of open source is intentional, and there is also a lot of common sense and good (open source) practice in the CRA, the expectation is that we are past the point where asking for a blanket exemption is productive.
| Translator’s note: the Members of the European Parliament on ITRE (Industry, Research and Energy):
https://www.europarl.europa.eu/committees/en/itre/home/members
At the ASF we expect to focus on the Council version (as its text generally ‘wins’ and is, right now, somewhat better than the ITRE consensus text). For this we can use your help: in particular, if you can help us get the executives of larger SMEs in your country engaged and willing to explain the impact at a national level (just contact the VP of Public Affairs, dirkx@apache.org).
| Translator’s note: the appeal in these final paragraphs is rather oblique; in essence it says that open source has not yet succeeded, and its comrades must keep striving.
Editor | Wang Jun
This article is shared from the WeChat public account Kaiyuanshe (开源社KAIYUANSHE).