The European Commission plans to let providers of artificial intelligence (AI) models such as ChatGPT draw up the code of practice they will use to demonstrate compliance in the short to medium term, with civil society involved only through consultation.
Civil society groups have spent recent months weighing whether to get involved in the general-purpose AI (GPAI) Code of Practice, sources familiar with the matter told Euractiv.
Meanwhile, Euractiv has learned that the European Commission is looking for consulting firms to draft the code.
Codes of practice are a key part of the AI Act, at least in the short to medium term.
GPAI providers such as OpenAI and Microsoft can use the code to demonstrate compliance with their obligations until a harmonised standard is created, and the Commission can give the code general validity within the EU through implementing acts.
“If the drafting process for the general-purpose AI code of practice is not a multistakeholder one, including civil society, academics and independent experts, it could become an industry-led process, which essentially means big tech companies writing their own rules,” one civil society figure, who asked not to be named given the evolving situation, told Euractiv.
The AI Act itself is vague on this point, saying only that providers, civil society organisations, and academia “may” participate in drawing up the code for general-purpose AI models.
According to information provided to Euractiv, model providers will primarily draft the code, with other stakeholders joining through consultations.
It remains unclear how this consultation will take place: it could mean one or more calls for input, or an invitation to be in the room in an observer capacity.
Sources familiar with the matter told Euractiv they took an announcement at a European Commission workshop in June as a sign that things are evolving and that the Commission may now be more open to involving civil society.
“The call for expressions of interest, to be published shortly, will outline concrete ways in which these stakeholders can be involved in the development of the code,” a European Commission spokesperson told Euractiv.
“The process of developing the code of practice will be supported by a range of stakeholders, including civil society,” the spokesperson added, without detailing how this would be achieved.
If providers do not sign up to the code, which is essentially voluntary, they must demonstrate on their own that they comply with the roughly 500-page law.
Drafting Process
According to information seen by Euractiv, the European Commission’s Directorate-General for Communications Networks, Content and Technology (DG CNECT) ran a mini-competition, based on an existing framework agreement, which closed in June.
The creation and drafting of the code would be outsourced to an external firm, which would have to decide who participates in the process, devise and implement a work programme, set up a working group that meets weekly, and draft the code within nine months.
The Commission has accordingly set an aggressive timeline for the consulting firm’s work.
Stakeholder consultations, agendas and methodologies will all need to be approved by the Commission’s newly established AI Office.
According to information seen by Euractiv, the AI Office will oversee the drafting but will not be heavily involved beyond approving the final text. The AI Board, made up of experts from the member states, will likewise not be heavily involved in the drafting process.
The winning firms will also develop best practices for the Commission to assess GPAI risks, including systemic risks.
The implementation of the AI Act has been delayed by several weeks, with the Act now due to enter into force in early August. It will then apply in full over the next two years, but the code of practice is due to be drawn up within nine months of entry into force.
The consulting firms drafting the code will be hired on the basis of a framework agreement, a “multi-year contract that sets out the basic terms” for contracts signed later, Albert Sánchez-Graells, a professor at the University of Bristol who studies EU procurement, told Euractiv.
Companies bid in public tenders and are pre-selected for general tasks such as “assisting the Commission’s services in carrying out audits.” When specific work arises under a framework agreement, the Commission can then hold smaller tenders among the pre-selected companies.
The Participation Conundrum
Such an industry-led process “risks leading to a dishonest implementation of the AI Act and would not adequately protect the safety and fundamental rights of EU citizens,” said one of the sources, who spoke on condition of anonymity.
These stakeholders raised their concerns with the European Commission at a separate workshop in late April, people who attended the meeting told Euractiv; the account was confirmed on LinkedIn by Luca Bertuzzi, formerly Euractiv’s tech editor.
The European Commission made it clear to the groups that they would not be a driving force behind the code. Several of them had by that point sent letters expressing their dissatisfaction, but most had yet to receive a full response, a person familiar with the matter told Euractiv.
The EU executive has already faced criticism over its handling of the AI Act’s implementation, with the appointment of a lead for the file in the AI Office, made without any public job advertisement or explanation and first reported by Contexte and subsequently by Euractiv, causing a stir.
A group of more than 30 civil society organisations, including the leading consumer organisation BEUC, have questioned the independence of the national authorities tasked with enforcing the law. In an open letter on June 26, they called on the European Commission to clarify these roles.
Three MEPs who were re-elected in June’s EU elections sent questions to the European Commission in April about the staffing process for the secretariat. They have yet to receive an official response.
[Edited by Alice Taylor/Zoran Radosavljevic]