The EU "AI 24 election act" is far too incomplete to work.

The EU "AI 24 election act" is far too incomplete to work.

The EU AI Act's final text is expected in Q2/Q3 2024. It then needs ratification, probably at the end of 2024 or in 2025. The @EUparliament elections are in June 2024.

The document to be finalized is effectively an election stance, both for the EU and for the USA in 2024. The United States issued an AI Executive Order. The UK government released a non-binding declaration of principles. China imposed a light-touch, business-friendly AI regulation (Zhang, 2024).

The EU's human-centric view on AI will probably generate some hard headlines from the more corporate-friendly Anglo-Saxon side.

This daring, confrontational EU position on AI must be placed in that larger geopolitical arena. The EU itself is under pressure to win the war in Ukraine.

Putin said on Russian state TV: "Whoever rules AI, rules the world."

Lost time. Little has been done since my professor explained to me why he expected so much from AI. It never let go of me. I researched rule-based, agent-based AI for several years, with mixed results. All those decades and seven generations of AI later, our governments have still missed the basics.

The EU is simply too slow in making its legislation. The architects of this AI Act told the European Parliament that its regulatory mix of product-safety and fundamental-rights criteria is not adapted to the latest AI models!

The latest general-purpose Large Language Models (#LLM), like ChatGPT, Llama or Google's Gemini, have an almost infinite range of purposes. The new law simply does not fit these new technologies.

My professor taught me to always give an example to prove a stance. Here it is. Take a private AI model. Add your own data to enhance it. This way you easily sail around all EU regulation, unseen. You run entirely in private mode! Run one of the many free open AI models on Nvidia hardware using VMware. Call me if you can't manage it quickly!
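
To make that concrete, here is a minimal sketch of what I mean (my own illustration, not a recommendation): an open-weight model loaded from local disk via the Hugging Face `transformers` library, with your private documents pasted straight into the prompt. The model name and file name below are placeholders; the point is that nothing leaves your machine, so no regulator ever sees it.

```python
# Minimal sketch: run an open-weight LLM entirely on local hardware and feed it
# your own documents, so no data or prompts ever leave the machine.
# Assumes the `transformers` and `torch` packages; the model name is illustrative.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",  # any open-weight model you have on disk
    device_map="auto",                           # use the local GPU if one is present
)

# "Add your own data": here simply by stuffing private documents into the prompt.
private_notes = open("my_private_notes.txt").read()  # hypothetical local file
prompt = f"Using only the notes below, answer: what are our Q3 risks?\n\n{private_notes}\n"

result = generator(prompt, max_new_tokens=200, do_sample=False)
print(result[0]["generated_text"])
```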

There are by now several dozen large AI foundation models. The EU is currently not home to any of the very large ones. For a very large model you also need to own the hardware layer (as above).

Both the AI Act and the EU's privacy rules make it virtually impossible to create such models. And that is before considering that Europe lacks the hardware and the people needed.

An example. Creative artists who use AI and start to claim copyright on its outputs sail around the laws. The EU's watermarking obligation does not apply when AI only assists humans, and regulating human-written 'prompts' is simply not enough. A Chinese court recently granted copyright to a developer who used prompts (Wang and Zhang, 2024), and Chinese copyright was already a hard nut that was never cracked. The EU solution of injecting some certificates is not enough. As a specialist in compliance and certification technology, I shiver. It is a missing chapter, if not a missing book, in the AI Act. Please call me if you need more. Clearly the EU's digital sovereignty is at stake!

Most agree that the EU population has already lost its privacy. We could soon lose our collective knowledge just as easily.

The fundamentals of the EU AI Act are missing.

My favorite missing part of the EU AI law: the AI sandbox.

Why sandboxes don't work: testing can find one error, maybe more, say ten. But how can you be sure you have found ALL errors?

Sandboxes are based on testing, and that does not work; many studies show a zero success rate.
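
A toy illustration of why (my own, purely hypothetical example): a function with one planted defect passes thousands of random "sandbox-style" tests, because testing only samples the input space and can never show the absence of errors.

```python
# Hypothetical example: random testing misses a defect that formal methods would catch.
import random

def absolute_value(x: int) -> int:
    # Deliberately planted defect on one single input; correct everywhere else.
    if x == 1_234_567:
        return -x
    return x if x >= 0 else -x

# "Sandbox-style" testing: sample inputs and check the property |x| >= 0.
failures = [x for x in (random.randint(-10**9, 10**9) for _ in range(10_000))
            if absolute_value(x) < 0]

print(f"{len(failures)} errors found in 10,000 random tests")
# Almost certainly prints "0 errors found" - yet the defect at 1,234,567 is still there.
```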

Old-fashioned sandbox testing does not solve that problem. The impact of using the wrong method is severe, because with AI:

  1. it is all about knowledge and privacy, and governments will insist on finding all errors, because an error could also mean IP theft;
  2. how do you get rid of errors in a model that is not traceable?
  3. how do you recreate the real-world environment that the draft act claims a sandbox provides?
  4. how do you remove errors in distributed, overseas models?
  5. the models are world-sized, so you cannot make an easy copy for testing as you are used to;
  6. testing requires many programmers, who are simply not available;
  7. the programmers we have are not well educated for this;
  8. a solution based on formal verification and blockchain is missing;
  9. testing, or any work done manually in a digital environment, causes "digitization asymmetry": it is hard to keep up with the need for speed when it comes to ad hoc changes. In eHealth, for example, regulations change on a daily basis. In this post I describe how digitization asymmetry causes governmental tax losses. The culture of many short test cycles also contributes to the congestion of sandboxes: too much quick code is produced and offered for testing, and the testers are left to do the job.

Sandboxes appear everywhere our governments want to verify ICT. Central banks, the EU and many others use this wrong method. Even Horizon 2020 projects ask for testing instead of the formal verification that is needed.

What is needed is formal proof based on mathematics and pure logic. Satoshi Nakamoto would not be happy if he saw this: his work relied only on pure formal proof. Such proof does not arrive from sitting together, but from studying math, something society has neglected. Is this the root cause? Can someone link to a sustainable, regulated sandbox environment?

Testing is by nature expensive: at least 70% of the cost of a project. A lot of governmental money is lost and the results are just not there. Of course the expected promise of the blockchain does not arrive either.

The mistake of using sandboxes is understandable. People are used to trying things out, but for the blockchain it is just not the right method. The academics involved should have known better.

What's needed is formal verification or a proof strategy instead of testing.
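
As a small taste of what that looks like in practice (my own toy example, assuming the z3-solver package; this is not the tooling mentioned below): an SMT solver either proves a property for every possible input or hands back a concrete counterexample, which is exactly what sampling-based testing cannot do.

```python
# Toy formal-verification example using the z3-solver package (an assumption, not
# the method referenced in this article): prove a property for ALL inputs at once.
from z3 import Int, prove

x = Int("x")

# Proved for every integer - no test cases, no sampling.
prove(x * x >= 0)   # prints: proved

# A false claim is not "passed by luck": the solver returns the exact counterexample.
prove(x * x > 0)    # prints: counterexample [x = 0]
```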

The solution is another kind of "sandbox". In this post I describe a method of formal verification that also leaves room for scenarios and for experimenting in another way, by other people. The Dutch blockchain knowledge association offers another type of "sandbox", based on formal proof or proof strategies, for free. Here is my latest post: How any organization can grow & adapt organic using blockchain.

More thoughts you may like are in this post, just as relevant: our laws and regulations are cast in iron, but they need to adapt and change! But how?

Here it is: How any organization can grow & adapt organic using blockchain.

(C) 2024 Arnoud Berghuis.

hashtags: #EUAI2024 #AI #EUregulations #FutureofAI #DigitalTransformation #fintech #legaltech #taxes #smartcontracts #ethereum #BC4G #emancipation #blockchain #discrimination #alanceforbetter #computerscience #innovation #digitalassets #blockchainheadhunter #digital #money #woman #future #cryptocurrency #distributedledger #cryptoexchange #education #blockchainjob #recruiting #technology #innovation #finance
