Three Horizons / Internet of Plastic Things / Rethinking the Legal Treatment of Robots / An Industrial Revolution for Biotech Research

The Three Horizons is a powerful framework for thinking about the future, developed by Bill Sharpe. I had the pleasure of being introduced to Bill and the framework as part of the work I am doing for Countdown. I have been exposed to many frameworks on change, and this one truly stands out, in my view at least, for marrying depth, through its systemic approach, with a simplicity that makes it easy to understand and apply.

It offers three different ways of looking at the future. Horizon 1 sees the future as “business as usual”; Horizon 3 is the vision of a viable (new and better) future; and Horizon 2 connects the two, supplying the innovation needed to reach the vision. Horizon 2 is the one that can tip the scale: if the innovation is captured by “business as usual” (H2-), the status quo prevails, but if it is harnessed for the vision (H2+), it is what can make that vision happen.


Kate Raworth, author of Doughnut Economics (another powerful concept…), has made a great overview of the framework in this video, which I highly recommend. If you want to go deeper, the first place to start is of course the book, but this website does a good job as well.

After engaging with the Three Horizons, what has really stuck with me is the importance of having H1, H2, and H3 work together with the right perspective, reinforcing each other. Having spent my whole life in “H3 land,” I have eventually come to recognize the importance of a constructive (not antagonistic) relationship with H1, and of turning H2 into H2+.

I love to think of The Antidisciplinarian as an H2+ place where we foster innovation to support Horizon 3.



Coronavirus Spurs an Industrial Revolution for Biotechnology Research

"We're moving away from designing labs for people to designing them for the tools we use." - Jennifer DiMambro, Arup's science and business leader for the Americas

In August, I covered SynBioBeta's piece, Will programming a cell ever be as easy as programming an app? where I glimpsed the idea of the synthetic biology "app", defined by life sciences consultant Matthew Kirshner as "a product produced by a microbe (such as silk or food protein), or the microbe itself (e.g., a bacterium that can substitute for traditional fertilizer)." The same month, IBM revealed RoboRXN, a drug-making lab housed completely in the cloud, with the aim of helping WFH scientists design and create new molecules at an accelerated pace (see the software and lab in action here). This gave us a peek at the infrastructure behind the biological "app" - the equivalent of what Amazon Web Services is for hosted web applications.

The idea of combining software, biology, and robots isn't new - we've seen research going back to the early 2010s - but the pandemic has accelerated the automation and digitization of bioscience research. Axios gives us a brief dive into the organizations pursuing "smart facilities" (or "virtual remote labs"... but we're likely to see many more applications crop up as the space matures). "Smart facilities can track everything done in a lab, automating data collection that in the past might have been kept on pad and paper and allowing researchers to maximize workflow," writes the publication. Furthermore:

●     "'We now have customers who can leverage robots to do a thousand experiments at once that in the past would have been done manually,' says Saji Wickramasekara, the CEO of the cloud-based informatics platform Benchling. 'You can get far more scale.'"

●     "Ginkgo Bioworks invested $400 million over the past five years to build a 100,000-square-foot automated bioengineering facility that more closely resembles a biological factory than a traditional lab."

●     For a deeper dive, Arup recently published a Future of Labs report.


Most Plastic Recycling Produces Low-Value Materials – But We’ve Found a Way to Turn a Common Plastic Into High-Value Molecules


"The volume of plastic the world throws away every year could rebuild the Ming Dynasty’s Great Wall of China – about 3,700 miles long." - The Conversation

Plastics are (barring extreme heat) extremely durable. That durability comes at the cost of slow decomposition: plastics can persist for several hundred years, perpetuating the world's pollution problem. Susannah Scott, Distinguished Professor of Chemistry at UC Santa Barbara, writing for The Conversation, wants a way to utilize our plastic waste and turn it into something much more valuable - while keeping it out of our environment.

Plastics are made by "stringing together a large number of small, carbon-based molecules in an almost infinite variety of ways to create polymer chains." While recycling facilities can melt and reshape plastic, the result loses the original material's properties and is relegated to lower-value uses like plastic lumber. Heating the material also wastes energy (aka more emissions). Scott and her team say they've discovered a clean way to turn polyethylene - one of the world's most used types of plastic - into useful smaller molecules:

"The process we have developed does not require high temperatures, but instead depends on tiny amounts of a catalyst containing a metal that removes a little hydrogen from the polymer chain. The catalyst then uses this hydrogen to cut the bonds that hold the carbon chain together, making smaller pieces. The key is using the hydrogen as soon as it forms so that the chain-cutting provides the energy for making more hydrogen. This process is repeated many times for each chain, turning the solid polymer into a liquid.... which are useful as solvents and can easily be turned into detergents. The global market for this type of molecule is about US$9 billion annually."
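The iterative chain-cutting loop can be abstracted computationally. The sketch below is purely illustrative - it models none of the actual chemistry or the catalyst's energetics, just the idea that repeatedly cutting long chains at random bonds quickly yields uniformly small fragments:

```python
import random

def cut_chains(chains, target=30, seed=0):
    """Repeatedly split any chain longer than `target` at a random bond,
    until every fragment is at or below the target size. The chain lengths
    and target are arbitrary illustrative numbers, not real molecule sizes."""
    rng = random.Random(seed)
    while any(n > target for n in chains):
        nxt = []
        for n in chains:
            if n > target:
                cut = rng.randint(1, n - 1)   # pick one bond to break
                nxt.extend([cut, n - cut])    # two shorter fragments
            else:
                nxt.append(n)                 # already small enough; keep
        chains = nxt
    return chains

fragments = cut_chains([1000])   # one long polyethylene-like chain
print(len(fragments), max(fragments))
```

No material is lost in this toy model: the fragment sizes always sum back to the original chain length, mirroring how the real process conserves the carbon in the polymer while redistributing it into smaller molecules.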



Why the Next Truck You See May Be a Quiet, Zero-Emission Hydrogen Fuel Cell Rig

An ambitious startup and several established automakers are promising to deliver hydrogen-powered semi-trucks that would create zero harmful pollutants.

Synthetic Biology Startup AbSci Raises $65M to Expand ‘Protein Printing’ Tech

Vancouver, Wash.-based biotech company AbSci raised an additional $65 million to help grow its synthetic biology platform.

Time Crystals May Be the Next Major Leap in Quantum Network Research

A team based in Japan has proposed a method to use time crystals to simulate massive networks with very little computing power.



It’s Time to Rethink the Legal Treatment of Robots

Though laws don't directly encourage businesses to automate processes, they indirectly favor automation "because labor is taxed more than capital," writes Ryan Abbott, Professor of Law and Health Sciences at the University of Surrey School of Law. Take one example: "If a chatbot costs a company as much before taxes as an employee who does the same job... it actually costs the company less to automate after taxes." Besides avoiding wage taxes, "businesses can accelerate tax deductions for some AI when it has a physical component or falls under certain exceptions for software." Since AI isn't a taxpaying citizen, indirectly incentivizing automation can hurt government programs, writes Abbott.
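Abbott's chatbot example is easy to make concrete with a little arithmetic. The figures and tax rates below are purely illustrative (they are not from his article), and real payroll and depreciation rules are far messier:

```python
# Hypothetical comparison: an employee vs. an automated service with the
# same pre-tax price. All rates are illustrative placeholders.
WAGE = 100_000          # employee salary == software's pre-tax cost
PAYROLL_TAX = 0.0765    # employer-side payroll tax rate (illustrative)
CORP_TAX = 0.21         # corporate income tax rate (illustrative)

# Employee: wages are deductible, but the employer also owes payroll tax.
employee_cost = (WAGE + WAGE * PAYROLL_TAX) * (1 - CORP_TAX)

# Automation: same deductible expense, no payroll tax. (Accelerated
# depreciation could lower the effective cost further; ignored here.)
automation_cost = WAGE * (1 - CORP_TAX)

print(f"employee:   ${employee_cost:,.0f}")
print(f"automation: ${automation_cost:,.0f}")
```

Even with identical sticker prices, the payroll-tax wedge alone makes the automated option cheaper after taxes - which is the distortion Abbott's tax-neutrality proposal targets.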

So... should we tax robots? Bill Gates proposed this back in 2017 - the same year the European Parliament rejected a robot tax. Abbott isn't against automation, but notes that it's "critical to craft tax-neutral policies to avoid subsidizing inefficient uses of technology and to ensure government revenue." He proposes a system of tax neutrality between people and AI, and pushes for simplicity: "though new tax regimes could directly target AI, this would likely increase compliance costs and make the tax system more complex."

"A better solution would be to increase capital gains taxes and corporate tax rates to reduce reliance on revenue sources such as income and payroll taxes. Even before AI entered the scene, some tax experts had argued for years that taxes on labor income were too high compared with other taxes. AI may provide the necessary impetus to finally address this issue." For more on the topic, see National Review's America’s Federal Robot Subsidy Hurts Workers.


Scientists Have Piloted a Tumbling Microbot Inside an Animal Colon for the First Time


Microbots and magnets seem to go together like peanut butter and jelly. Over the past decade, we've seen researchers extol the potential of external-magnet-operated, battery-less tiny robots for precisely targeting cancer cells and connecting neural networks. Purdue University's researchers have been experimenting with magnetically controlled microbots since at least 2014. Recently, the university's scientists published a paper (and the video below) describing how they managed to steer a microbot through an animal colon for the first time (see their 2018 video for additional details).

"When we apply a rotating external magnetic field [like in a commercial MRI system] to these robots, they rotate just like a car tire would to go over rough terrain," said Purdue mechanical engineer David Cappelleri. In humans, in addition to the use cases mentioned, such microbots could be used for tissue collection (reducing the need for minimally invasive surgery) and payload delivery.


AI-Powered Mini-Brains Help Robots Recognize Pain and Self-Repair

NTU Singapore scientists develop "mini-brains" to help robots recognize pain and self-repair.

Agility Robotics Raises $20M

The Oregon startup plans to scale manufacturing of its bipedal robots for logistics, retail, and e-commerce clients.



Here Comes the Internet of Plastic Things, No Batteries or Electronics Required


Wi-Fi Backscatter is... well, before we get to that, let's define RF-powered computing. According to IoT Agenda, "RF-powered computing is the use of radio frequency (RF) signals to enable the operation and communication of low-power devices, typically for machine-to-machine (M2M) networking." RF-powered devices subsist solely on energy harvested from radio frequencies - aka no batteries required. But they still need something like an RFID reader to connect to the internet. Wi-Fi Backscatter, according to the University of Washington page, allows the reuse of existing Wi-Fi infrastructure to provide connectivity to RF-powered devices. Backscatter tech was prototyped in 2014. Recently, another team at the University of Washington leveraged the technology to create electronics- and battery-free plastic devices, made with off-the-shelf 3D printers, that can communicate with Wi-Fi devices like smartphones.

"Once the reflective material was created, the next challenge for the researchers was to communicate the collected data. The researchers ingeniously translated the 0 and 1 bits of traditional electronics by encoding these bits as 3D printed plastic gears. A 0 and 1 bit are encoded with the presence and absence of tooth on the gear, respectively. These gears reflect the WiFi signal differently depending on whether they are transmitting a 1 bit or a 0 bit. 'The way to think of it is that you have two parts of an antenna,' explained Shyam Gollakota, an associate professor at the University of Washington."
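The encoding idea itself is simple enough to sketch in a few lines. This toy model is not the researchers' implementation - it only shows the mapping they describe, where a tooth on the 3D-printed gear stands for a 1 bit and a gap for a 0 bit:

```python
def encode(bits):
    """Render a bit list as a gear profile: 'T' = tooth (1), '_' = gap (0)."""
    return ''.join('T' if b else '_' for b in bits)

def decode(gear):
    """Recover the bit list from the gear profile, as a receiver would
    recover it from the differing Wi-Fi reflections of tooth vs. gap."""
    return [1 if seg == 'T' else 0 for seg in gear]

message = [1, 0, 1, 1, 0]
gear = encode(message)
assert decode(gear) == message   # the mapping round-trips losslessly
print(gear)   # → T_TT_
```

In the physical device, "reading" the gear back is done by the receiver observing how each passing segment modulates the reflected Wi-Fi signal, rather than by inspecting the gear directly.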



New Virtual Reality Software Allows Scientists to 'Walk' Inside Cells

vLUME allows super-resolution microscopy data to be visualised and analysed in VR.

AR Is Finally Infiltrating Everyday Tasks Such as Google Search

Google would like to use your phone’s camera for augmented-reality overlays of search results on your view of the world.



AI-Driven Dynamic Filmmaking Is the Future


"Dynamic Films take the audience on a journey through branching pathways, toward a moment of inner change, making them see the world a little differently than before they began." - Agence director Pietro Gagliano

VR experience Agence falls somewhere between a film and a game - and while dynamic film is a bit of a vague term, it seems appropriate for an experience that uses reinforcement learning (an area of machine learning) to control its animated characters. Director Pietro Gagliano - self-described "international-award winning man-child creative" - sees Agence as a sort of "silent-era dynamic film. It’s a beginning, not a blockbuster." The plot centers around "a group of creatures and their appetite for a mysterious plant that appears on their planet. Can they control their desire, or will they destabilize the planet and get tipped to their doom?" writes MIT Technology Review.

We usually think of experiences like Bandersnatch or games like Heavy Rain as being both dynamic and cinematic, but what Gagliano had in mind was a bit more... emergent. The difference with a dynamic film, according to the director's guest post on VRFocus, is that it "allows users to interact with emergent narrative without the consequence of winning, failure, progress or defeat. It creates a path to great storytelling with unique outcomes from every interaction." Gagliano sees media evolving past a "one-way street," morphing into an "interrelated, interdependent dance between user and algorithm, turning what we now call films and games, into real, living simulations."
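For readers unfamiliar with the reinforcement learning that drives Agence's characters, here is a minimal Q-learning sketch. It is entirely illustrative and unrelated to the film's actual code: an agent on a five-cell line learns, through trial, error, and reward, to move toward a "plant" at one end.

```python
import random

random.seed(0)
N, GOAL = 5, 4                        # five cells; the "plant" sits at cell 4
Q = [[0.0, 0.0] for _ in range(N)]    # Q[state][action]; actions: 0=left, 1=right
alpha, gamma, eps = 0.5, 0.9, 0.2     # learning rate, discount, exploration rate

for _ in range(500):                  # 500 training episodes
    s = 0
    while s != GOAL:
        # epsilon-greedy: mostly exploit the best-known action, sometimes explore
        a = random.randrange(2) if random.random() < eps \
            else max((0, 1), key=lambda x: Q[s][x])
        s2 = max(0, min(N - 1, s + (1 if a else -1)))
        r = 1.0 if s2 == GOAL else 0.0
        # standard Q-learning update
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

# The learned greedy policy: 1 (right, toward the plant) in every non-goal state.
policy = [max((0, 1), key=lambda x: Q[s][x]) for s in range(N)]
print(policy)
```

Agence's agents are of course far richer than this, but the core loop - act, observe a reward, nudge the value estimates - is the same mechanism that lets the film's creatures develop behavior no one scripted.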



New MIT Algorithm Automatically Deciphers Lost Languages

A new AI system can automatically decipher a lost language that’s no longer understood — without knowing its relationship to other languages.



Attack Drones Dominating Tanks as Armenia-Azerbaijan Conflict Showcases the Future of War

The atrocious damage done by chemical weapons during WWI led to the signing of the Geneva Protocol in 1925, which prohibited their use. It didn't, however, ban development, production, or stockpiling. It was only in 1997 that the Chemical Weapons Convention went into effect and prohibited the "development, production, acquisition, stockpiling, retention, transfer or use of chemical weapons by States Parties." As for more modern weapons, the International Committee of the Red Cross already prohibits blinding laser weapons (this is a fascinating read, especially for 1994). If it took over seventy years to get a proper agreement in place for chemical weapons, how long will it take for autonomous killing machines - like those fictionalized in the Future of Life Institute's Slaughterbots video - to make the list?

While there is support in the research community for banning lethal autonomous weapons, some military theorists and computer scientists believe robots are not "subject to all-too-human fits of anger, sadism or cruelty" and could even serve as preventative measures: "According to Michael Schmitt of the US Naval War College, military robots could police the skies to ensure that a slaughter like Saddam Hussein’s killing of Kurds and Marsh Arabs could not happen again." Beyond these divergent opinions, the idea of programming laws and ethics into an autonomous machine also "raises enormous practical difficulties.... Like an autonomous car rendered helpless by snow interfering with its sensors, an autonomous weapon system in the fog of war is dangerous."
