Bookmarked Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence

Finalised in June, the AI Act (EU 2024/1689) was published yesterday, 12-07-2024, and will enter into force 20 days later, on 01-08-2024. The law will generally become applicable after 2 years, on 02-08-2026, with a few exceptions:

  • The rules on banned practices (Chapter 2) will become applicable in 6 months, on 02-02-2025, as will the general provisions (Chapter 1)
  • Parts such as the provisions on notified bodies, general purpose AI models (Chapter 5), governance (Chapter 7), and penalties (Chapter 12) will become applicable in a year, on 02-08-2025
  • Article 6 in Chapter 3, on the classification rules for high-risk AI applications, will apply in 3 years, from 02-08-2027

The purpose of this Regulation is to improve the functioning of the internal market by laying down a uniform legal framework in particular for the development, the placing on the market, the putting into service and the use of artificial intelligence systems (AI systems) in the Union, in accordance with Union values, to promote the uptake of human centric and trustworthy artificial intelligence (AI) while ensuring a high level of protection of health, safety, fundamental rights as enshrined in the Charter of Fundamental Rights of the European Union (the ‘Charter’), including democracy, the rule of law and environmental protection, to protect against the harmful effects of AI systems in the Union, and to support innovation. This Regulation ensures the free movement, cross-border, of AI-based goods and services, thus preventing Member States from imposing restrictions on the development, marketing and use of AI systems, unless explicitly authorised by this Regulation.

Bookmarked Commission opens non-compliance investigations against Alphabet, Apple and Meta under the Digital Markets Act (by European Commission)

With the large horizontal legal framework for the single digital market and the single market for data mostly in force and applicable, the EC is initiating its first enforcement actions. This announcement focuses on app store aspects: steering (third parties being able to offer users other ways of paying for services than e.g. Apple’s App Store), the freedom to (un)install any app and change settings, and providers preferencing their own services above those of others. Five investigations for suspected non-compliance involving Google (Alphabet), Apple, and Meta (Facebook) have been announced. Amazon and Microsoft are also being investigated, in order to clarify aspects that may lead to suspicions of non-compliance.

The investigation into Facebook concerns their ‘pay or consent’ model, Facebook’s latest attempt to circumvent their GDPR obligation that consent should be freely given. It was clear that this move, even if it allows them to steer clear of the GDPR (which is still very uncertain), would create issues under the Digital Markets Act (DMA).

In the same press release the EC announces that Facebook Messenger is getting a 6-month extension of the period in which to comply with interoperability demands.

The Commission suspects that the measures put in place by these gatekeepers fall short of effective compliance of their obligations under the DMA. … The Commission has also adopted five retention orders addressed to Alphabet, Amazon, Apple, Meta, and Microsoft, asking them to retain documents which might be used to assess their compliance with the DMA obligations, so as to preserve available evidence and ensure effective enforcement.

European Commission

Bookmarked Coding on Copilot: 2023 Data Suggests Downward Pressure on Code Quality by William Harding and Matthew Kloster

GitClear takes a look at how the use of Copilot is impacting coding projects on GitHub. They signal several trends that negatively impact overall code quality. Churn is increasing (though by the looks of it, that trend started earlier), meaning the amount of code that is corrected or discarded very soon after being written is rising. And more code is being added to projects, rather than updated or (re)moved, indicating a trend towards bloat (my words). The report I downloaded mentions the latter as worsening the asymmetry between writing/generating code and the time needed for reading/reviewing it. This increases downward quality pressure on repositories. I use GitHub Copilot myself, and, as GitHub itself reports, it helps me generate code much faster. My use case however is personal tools, not a professional coding practice. Given my relatively unskilled starting point, Copilot makes the difference between not having and having such personal tools. In a professional setting, however, more code does not equate to better code. Upon a first skim, the report highlights where the benefits of Copilot clash with desired qualities of code production, code quality and teamwork in professional settings.
Via Karl Voit

To investigate, GitClear collected 153 million changed lines of code, authored between January 2020 and December 2023… We find disconcerting trends for maintainability. Code churn — the percentage of lines that are reverted or updated less than two weeks after being authored — is projected to double in 2024 compared to its 2021, pre-AI baseline. We further find that the percentage of “added code” and “copy/pasted code” is increasing in proportion to “updated,” “deleted,” and “moved” code.

GitClear report
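As an aside, the churn definition quoted above is straightforward to turn into a metric. Below is a minimal sketch of the idea in Python, not GitClear’s actual tooling, assuming you can already derive, for each changed line, when it was authored and when (if ever) it was next modified or reverted.

```python
from datetime import datetime, timedelta

# Churn as defined in the report: the share of lines reverted or updated
# less than two weeks after being authored. The input format here is an
# assumption for illustration; GitClear derives this from repository history.
CHURN_WINDOW = timedelta(days=14)

def churn_percentage(lines):
    """lines: iterable of (authored_at, next_changed_at or None) datetimes."""
    total = churned = 0
    for authored_at, next_changed_at in lines:
        total += 1
        if next_changed_at is not None and next_changed_at - authored_at < CHURN_WINDOW:
            churned += 1
    return 100.0 * churned / total if total else 0.0

# Example: two of three lines were rewritten within two weeks -> ~66.7%
sample = [
    (datetime(2024, 1, 1), datetime(2024, 1, 5)),
    (datetime(2024, 1, 1), datetime(2024, 1, 10)),
    (datetime(2024, 1, 1), None),
]
print(round(churn_percentage(sample), 1))
```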

Bookmarked Internet of Things and Objects of Sociality (by Ton Zijlstra, 2008)

Fifteen years ago today I blogged this brainstorming exercise about how internet-connectivity for objects might make for different and new objects of sociality. A way to interact with our environment differently. Not a whole lot of that has happened, let alone become common. What has happened is IoT being locked up in device and mobile app pairings. Our Hue lights are tied to the Hue app, and if I’d let it collect e.g. behavioural data it would go to Philips first, not to me. A Ring doorbell (now disabled) and our Sonos speakers are the same. Those rigid pairings are a far cry from me seamlessly interacting with my environment. One exception is our Meet Je Stad sensor in the garden, as it runs on LoRaWAN and the local citizen science community has the same access to the data as I do (and I run a LoRa gateway myself, adding another control point for me).

Incoming EU legislation may help to get more agency on this front. First and foremost, the Data Act, once finalised, will make it mandatory that I can access the data I generate through my use of devices like those Hue lights and Sonos speakers, and any others you and I may have in use (the data from the inverter on your solar panels, for instance). It will also allow third parties to use that data in real time. A second relevant law, I think, is the Cyber Resilience Act, which regulates the cybersecurity of any ‘product with digital elements’ on the EU market and makes it mandatory to provide additional (technical) documentation around that topic.

The internet of things, increases the role of physical objects as social objects enormously, because it adds heaps of context that can serve relationships. Physical objects always have been social objects, but only in their immediate physical context. … Making physical objects internet-aware creates a slew of possible new uses for it as social objects. And if you [yourself] add more sensors or actuators to a product (object hacks so to speak), the list grows accordingly.

Ton Zijlstra, 2008

The three of us had a fun first visit to the local CoderDojo this afternoon. Y animated dinosaurs and created an earth with wobbly eyes that followed the mouse pointer.


Y working in Scratch on some animated dinosaurs

A month ago, Y had a ‘programming day’ at school where people from De Programmeerschool worked a full day with her class. She liked working in Scratch, so I suggested we visit the local CoderDojo. Next time I think we should try again to bring a friend. She did invite a friend this time, but there were no more tickets available (although there was still plenty of space on-site).

I keynoted at the Dutch DojoCon in 2019, and then became a member of their Club of 100, donating money every year. And today I was able to bring my daughter and enjoy the activities. I first came across CoderDojo in Limerick, Ireland, during the 2014 3D Camp co-organised by our friend Gabriela Avram.

Bookmarked China Seeks Stricter Oversight of Generative AI with Proposed Data and Model Regulations (by Chris McKay at Maginative)

Need to read this more closely. A few things stand out at first glance:

  • This adds to the geopolitical stances that the EU, US, and China put forth w.r.t. everything digital and data. A comparison with the EU AI Regulation that is under negotiation would be of interest.
  • It seems focused solely on generative AI. Are there other (planned) acts covering other AI applications and development? Is generative AI singled out here because it has a more directly public-facing character?
  • It seems to mostly front-load responsibilities onto the companies producing generative AI applications, i.e. towards the models used and the pre-release phase. In comparison, the EU regulation incorporates responsibilities for distributors, buyers, users and even users of output only, and spans the full lifetime of any application.
  • It lists specific risks in several categories. How specifically are those worded, and might that impact how future-proof the regulation is? Are thresholds introduced for such risks?

Let’s see if I can put some AI to work to translate the original Chinese proposal text (PDF); a rough sketch of what that could look like follows below.

Via Stephen Downes, who is also my source for the link to the original proposal in PDF.
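A minimal sketch of that translation idea, assuming the pypdf and openai Python packages; the file name and model name are placeholders I made up, the text is fed to the model page by page, and any serious use would of course need checking the output against the original.

```python
# Illustrative only: extract the Chinese text from the proposal PDF and ask a
# large language model to translate it. The file name, model name, and the
# page-by-page chunking are assumptions for the sake of the sketch.
from pypdf import PdfReader
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

reader = PdfReader("china-genai-proposal.pdf")  # hypothetical local filename
pages = [page.extract_text() or "" for page in reader.pages]

translated = []
for chunk in pages:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any capable multilingual model
        messages=[
            {"role": "system",
             "content": "Translate this Chinese regulatory text into English, "
                        "preserving article numbering and structure."},
            {"role": "user", "content": chunk},
        ],
    )
    translated.append(response.choices[0].message.content)

print("\n\n".join(translated))
```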

By emphasizing corpus safety, model security, and rigorous assessment, the regulation intends to ensure that the rise of [generative] AI in China is both innovative and secure — all while upholding its socialist principles.

Chris McKay at Maginative