Three years at AIML - what did I learn?

I have recently stepped down from my role as Business Development Manager at the Australian Institute for Machine Learning (AIML). It has been a challenging and yet satisfying three years, discovering machine learning and artificial intelligence, and helping to grow what was already a very successful research group.

Before taking on this role, I had been running my own consulting business for 14 years. In this capacity, I played a small role in helping the University of Adelaide present its case to the South Australian government to establish the partnership that became AIML. Professor Anton van den Hengel, the Institute’s inaugural and extraordinary Director, asked if I would take on the role of Business Development Manager. The role promised less pay, more hours and more pressure. For some reason, I originally said no. But the idea stayed with me. Artificial intelligence is a defining technology of the 21st century. The ability to play a part in the development of this sector in Australia was intriguing. And this was a chance to build something – not just a research institute in a strategically important discipline, but a new model of collaboration between state government and universities in the development and application of core research capability. So I agreed a few months later.

This article is an opportunity for me to distil some of what I have learned over the past three years - for my own sake as much as anything else. Over this period, I have had the privilege of working with the brightest minds in artificial intelligence, collaborating with state government and the University of Adelaide on a new partnership model, sitting on the Global Partnership on AI – an international effort to improve trust in AI, working with local SMEs to develop products to take to market, and engaging with some very passionate people who care deeply about how Australia will, or will not, benefit from this new and very powerful form of technology.

What I am presenting below are my views only. They are views not just about artificial intelligence, but also the whole innovation system in Australia.

Note that this is a relatively long article and will take about 30 minutes to read from beginning to end.

If you don’t have that long, a simple summary is:

  • Artificial intelligence is a new way of creating software by training it on examples rather than writing a set of instructions. It is not intelligent or a replacement for humans, but it is a very clever way of looking for patterns in data, or in some cases, creating new patterns.
  • This new way of creating software relies very heavily on high quality data assets.
  • Artificial intelligence is changing the economics of production and the tactics of war and security.
  • Governments around the world are realising that there are a whole range of compelling economic, sovereignty and security issues around artificial intelligence and data, and so they are investing large amounts of funding in research, education and training. Australia has fallen behind in this regard.
  • Businesses are voting with their wallets. Artificial intelligence is attracting and creating eye-watering amounts of money. Despite great technical prowess, Australia is not extracting the economic potential it could be.
  • The threat to jobs comes not from the artificial intelligence itself, but from failing to invest in research, education and training.
  • If you are a woman, the opportunities in this sector are particularly compelling.
  • It is the very best researchers and engineers who will make the breakthroughs in the science and performance of artificial intelligence. We should invest in making more of them.
  • Autonomous weapons and battle systems are emerging, are tricky to regulate, offer great advantages and great risks, and we need to engage deeply in their regulation and development.
  • Addressing the worst downsides of artificial intelligence will require new standards, strong institutions and high-quality education.
  • We need more organisations that aim to use artificial intelligence to make the world a better place. The opportunities are endless.
  • Australia’s innovation ecosystem is a bit of a shambles and urgently needs an overhaul.

1. Artificial intelligence – the ultimate misnomer

There are lots of definitions of artificial intelligence, and they are probably all wrong. Artificial intelligence is definitely not intelligent, but the very term 'intelligence' is one reason it conjures up irrational fears about how it might compete with humans. A common definition runs along the lines that artificial intelligence "mimics the problem-solving and decision-making capabilities of the human mind". But it is both more than that and not quite that. The term covers:

  • computer vision - the ability of computers to understand what they are 'seeing',
  • the ability to extract meaning from text,
  • the ability to analyse large amounts of information so as to detect patterns,
  • the creation of poetry and visual art, and
  • the ability to make independent decisions.

It might seem human at first glance, but it lacks at least one very important component that humans have - an endocrine system. A human has an ability to process information, but also has a suite of hormones that combines with the processing power of the brain to generate motivation, empathy, attachment, hate, desire and ambition. Artificial intelligence has none of these things. It does not wish to rule, nor will it be offended if you kill it off. It does not get tired or hungry. It does not wish for a higher status and it will not try to seek revenge. It does not have a set of values that are a combination of encoded survival instincts, hormone derived values, culturally evolved norms and logic. The difference is fundamental. Artificial intelligence is another way of processing information more effectively. It is not a replacement for humans, nor does it have the slightest motivation to replace humans. It is not intelligent, it is a clever way of looking for patterns in data, and in some cases creating new patterns.

While many tools contribute to the technology library called ‘artificial intelligence', the innovation that has made all the difference is machine learning. The traditional way of creating code – writing a set of instructions, a recipe if you like – has been replaced by training a computer to do a task by giving it a lot of examples. This has opened up a whole range of applications of software that were previously impossible to achieve with a ‘set of instructions’ approach. With machine learning, we can train a computer to recognise individual human faces by giving it a lot of examples of faces (try working out how you would do that through a set of instructions). We can train a computer to write poetry like Keats, or like Iggy Pop, by giving it the poetry collections of the same. We can train a computer to detect fraud, translate between languages, assess contracts, count trees, keep us interested on Facebook, and find the information we were thinking of when we enter a search term into a search engine. All by giving it examples and letting it work out how to repeat the task. The weakness, though, is that you need the right set of examples to do the training. If you train a facial recognition system on mostly white faces, that system will do poorly at recognising faces from other races. If you train a conversation bot on Twitter, it will be racist, sexist and overly aggressive. If you train an HR bot to select candidates using current selection data from humans, it will have a bias towards selecting males. If you train a system to detect cancer from a limited number of examples, it will miss the more obscure signals. So training data is really, REALLY important.
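The 'programming by examples' idea above can be made concrete with a toy sketch. Below, instead of hand-writing rules to separate two groups of points, a tiny perceptron learns a decision boundary from a handful of labelled examples. (This is an illustrative toy only, with made-up data; real systems use millions of examples and far richer models.)

```python
# Toy "learning from examples": a perceptron finds its own rule for
# separating two classes of points, rather than being given one.

def train_perceptron(examples, labels, epochs=20, lr=0.1):
    """Learn weights from (x1, x2) examples with labels +1 / -1."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(examples, labels):
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else -1
            if pred != y:  # only update the rule when it makes a mistake
                w[0] += lr * y * x1
                w[1] += lr * y * x2
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if (w[0] * x[0] + w[1] * x[1] + b) > 0 else -1

# "Training data": small points are class -1, large points are class +1.
examples = [(1, 1), (1, 2), (2, 1), (5, 5), (6, 5), (5, 6)]
labels = [-1, -1, -1, 1, 1, 1]

w, b = train_perceptron(examples, labels)
print(predict(w, b, (1.5, 1.5)))  # a small point -> -1
print(predict(w, b, (6, 6)))      # a large point -> +1
```

Note that nowhere did we write an explicit rule like "if both coordinates are below 3, it is class -1"; the boundary was worked out from the examples. That is also exactly why the examples matter so much: feed the same code a skewed set of examples and it will learn a skewed boundary.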

2. Artificial intelligence is changing the economics of production but Australia is not really participating

Artificial intelligence has changed how we find information, how we shop, how we interact with our friends and family, how we are entertained and transported, and how we find potential mates. TRILLIONS of dollars in wealth have been created and destroyed as a result of this technology. The largest companies in the world are artificial intelligence companies - Google, Facebook, Amazon, Microsoft, Tesla, Nvidia, Alibaba, Tencent. On the US stock market, these companies have been the key drivers of wealth creation – going from contributing 5% of the value of the US equities market to over 17% in just the last ten years.

  • Microsoft: $220B to $2,250B
  • Alphabet: $335B to $1,900B
  • Amazon: $100B to $1,730B
  • Facebook: $106B to $995B
  • Tesla: $2.5B to $767B
  • Nvidia: $8.4B to $550B

These six companies alone have increased from $772B to over $8,000B in value (more than 10x) - an increase in wealth of over $7 trillion in that period. The total value of the US equity market increased from $15T to $47T (3x). And the effect is accelerating. Since the end of 2017, with the S&P 500 up 23%, these large artificial intelligence companies account for about 72% of that performance. The artificial intelligence economy is growing much faster than the rest of the economy.

Unsurprisingly, investment in artificial intelligence globally is increasing rapidly. Venture capital investment in artificial intelligence over the past ten years has totalled $60B, and investment within the big technology companies will have been much more than this. There are major investment funds established to invest in artificial intelligence into the near future, including SoftBank ($28B), Institutional Venture Partners, Andreessen Horowitz ($7B), Toyota AI Ventures ($100 million), Y Combinator, M12 (Microsoft's venture fund), and many, many more. The International Data Corporation forecasts spending on artificial intelligence technologies to double from 2020 to 2024, reaching $110 billion per annum.

Artificial intelligence is changing the economics of production. There are numerous studies showing that artificial intelligence improves accuracy, precision, performance and cost-competitiveness. Alibaba describes itself as an artificial intelligence company. It has used AI to reduce friction for retail customers. In a recent Singles' Day sale, some customers had products delivered to their home within ten minutes of pressing 'buy', because Alibaba had predicted where sales would occur down to the street, and had trucks waiting on that street with the right products as the sale started. Billions of dollars of cost and waste have been stripped out of its supply chains using prediction and analysis. Competitors who choose not to use AI are simply not going to be able to compete.

Australia is not really participating in this massive economic opportunity. The amount of investment in artificial intelligence ventures in Australia is in the order of $10Ms per annum - tiny when compared to what is listed above and in comparison with our international peers. At a recent meeting of the Australian AI Network, our artificial intelligence companies bemoaned the difficulty of raising capital in Australia compared to other countries.

It is a shame, because Australia has world-class technical expertise to draw upon. The University of Adelaide, University of NSW and University of Melbourne rank between 3rd and 14th in global rankings for various disciplines of artificial intelligence. An Australian team led by CSIRO's Data61 recently came runner-up in a global competition run by the US Defense Advanced Research Projects Agency on the operation of autonomous robots underground. At almost the same time, the University of Adelaide came third in the NASA Robotics Challenge. Our experts in artificial intelligence can compete above ground, below ground and even out in space with the best in the world, but we are failing to capitalise on this commercially at the moment (more later).

3. The threat to jobs is not from automation but from a lack of investment in education, training and research

There have been a number of questionable predictions that artificial intelligence will result in mass unemployment. I think the opposite should be of concern. Businesses today will need to adopt artificial intelligence to remain competitive. Those that do it well will be more successful and create more jobs. Jobs will be lost in those businesses that don’t. This is the history of automation generally. For example, in the 1980s and 1990s, car makers who adopted robotics thrived. Rather than losing jobs, they have increased their staff. Those that chose not to adopt robotics no longer exist; all of their workers have had to retrain, and some never found work again – as workers in my home town of Adelaide understand all too well. The existential threat to jobs belongs to those countries and companies who FAIL to adopt artificial intelligence.

More generally, as artificial intelligence automates some tasks, the price of those services will drop, because they are cheaper to produce. This will leave more money in people’s pockets so they can buy new services and products. This is exactly what happened as we automated agriculture. Food became much cheaper and is now a much lower proportion of our weekly spend. There are fewer jobs in agriculture, but this has freed human effort and capital resources to develop new products and services, which in turn create new jobs. We are arguably all richer as a result.

The other reason we can be confident that automation will not result in mass unemployment is that it turns out that humans have infinite demands. As a result of all sorts of automation related to the internal combustion engine, computers, economies of scale and robotics, we have become a lot more efficient and productive since the 1950s. But we have not maintained the lifestyles of the 1950s and pocketed the change. We now buy coffee every day, and have mobile phones and huge TVs and safer cars and more efficient whitegoods and wellness centres and trips to Bali - and the list goes on. This extra demand has created all sorts of new jobs. And the big winner has been the technology sector itself. This sector is now worth $167bn per annum in Australia, making it the third-biggest industry in Australia today, employing 861,000 Australians, and growing fast (Tech Council of Australia 2021).

Until humans reach a point whereby we are satisfied with what we have, artificial intelligence will simply shift spending patterns, creating new jobs and new industries. BUT ONLY IF we invest in training, education and research so that workers can develop the new skills and entrepreneurs can create the new products needed for the changing economy.

Since the 1950s, not all countries have succeeded in the way the modernised economies have - often due to a failure to invest in education, training and research; corruption leaching wealth from the economy; war; past colonial injustices; and/or poor governance and institutional systems. The pathway to success is to back the population with investments in education, training and research, and to put in place strong and accountable institutional arrangements so that people can operate within an environment of trust and freedom. It is institutional structures, and choices about how community assets are deployed, that will be the key driver of future employment in Australia and other countries.

4. Governments are realising that their nations must excel in artificial intelligence to remain competitive

Governments around the world have woken up to the power of artificial intelligence and how to harness it for national advantage. Many countries have released artificial intelligence strategies and made investments in research, training and education, including:

  • China: $25B
  • Germany: $3B over seven years
  • USA: $2B
  • South Korea: $2B over four years
  • Spain: $725M over three years
  • Singapore: $150M over three years
  • France: $1.8B over four years
  • Sweden: $585M over nine years
  • Japan: $550M
  • And many more I won't list but Michael Evans has written more about this here.

Australia has announced $124M as part of its AI Action Plan – to advance translation of existing research and fund PhD scholarships. This is a good start, but it is still less than what Singapore (population 5.7 million) has announced, and does little to lift the research capability of Australia.

5. Artificial intelligence and data are strategic assets that we should develop for Australia’s benefit

Data is one of the raw ingredients required to develop artificial intelligence. You cannot train a system to undertake a task if there is no historical data to train with, or if the data is poor quality and/or biased. There is a global race to access quality datasets, precisely because they are what quality artificial intelligence systems are built from. Australia has high-quality national datasets. We have fantastic industry datasets. These are datasets of national strategic importance.

Let me illustrate the sovereign importance of data with an example. Australian farmers have already had experiences whereby they contributed data to a national dataset in order to create a helpful artificial intelligence system. That data was accessed by their customers, who were seen as industry participants and so had rights to it. These customers were able to use the data to reverse-engineer the farmers' costs of production, and then used this information to offer a lower price because they knew what the profit margin was likely to be. No matter how efficient artificial intelligence can make you, if you cannot sell at a profit, your business is bust.

Some datasets are sovereign assets which should be used for the benefit of the Australian people and our businesses. Data relating to health, education, national security and some strategic industries (cybersecurity? critical minerals?) should be reserved for Australian enterprises, and in some cases only for government-owned ones. Governments and individual businesses should think carefully about what these are and protect them as a matter of urgency. Data trusts, which protect and manage data on behalf of the people who contribute it, could be used to better manage collective data for the benefit of the contributors. Federated learning allows for the development of artificial intelligence without the need for people to share data. Cybersecurity becomes even more important to protect these valuable assets. These are challenges that we must face now and develop solutions for.
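The federated learning idea mentioned above can be sketched in a few lines. In the toy below, several organisations jointly fit a shared model parameter without any of them revealing their raw records: each party computes a local update, and only the proposed updates are pooled by a coordinator. (This is a minimal illustration with made-up numbers; real federated systems add secure aggregation, differential privacy and much richer models.)

```python
# Toy federated averaging: parties share model updates, never raw data.

def local_update(shared_model, local_data, lr=0.5):
    """Nudge the shared model toward this party's data; data stays local."""
    grad = sum(shared_model - x for x in local_data) / len(local_data)
    return shared_model - lr * grad  # this party's proposed new model

def federated_round(shared_model, parties):
    """The coordinator averages proposals without ever seeing the data."""
    proposals = [local_update(shared_model, data) for data in parties]
    return sum(proposals) / len(proposals)

# Three parties with private datasets (overall mean happens to be 5.0).
parties = [[2.0, 4.0], [5.0, 5.0], [6.0, 8.0]]

model = 0.0
for _ in range(30):
    model = federated_round(model, parties)
print(round(model, 2))  # -> 5.0: the shared model converged
```

The key property is in what crosses the wire: `federated_round` receives only the three proposed numbers, never the underlying records, yet the shared model still converges to what it would have learned on the pooled data.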

6. The women drought in artificial intelligence

There is a massive shortfall of women who are skilled in artificial intelligence. From where I sit, I can see that the demand is huge and the supply is tiny. The reason for a low supply is tied up with the historic and systematic bias against women in computer science departments and workplaces, but also the nerdification of computer science that made it an uncool thing to be good at. I suspect 1980s and 1990s popular culture has to take at least some of the blame. But don't get me started down that rabbit burrow...

The tragedy of where we are at is that there has never been a better time for women to get into the field of artificial intelligence. Because the demand for workers in this field is expanding exponentially, and reaching gender targets in a male-dominated industry is rightly a core concern of many businesses, the deals that women can get at the moment are spectacular.

If you are reading this as a woman of any age, wondering what to do with your life, for goodness sake think seriously about looking at one of the many free machine learning coding courses at the very least (Fast.ai, Stanford University). Even better, go to university and get a PhD from an institution strong in the field (check out csrankings.org to find the good ones in Australia). You will need to be competent at maths and coding, but mostly be good at solving problems. Walk in and demand to be respected and treated like a queen, because I am telling you now, they need you a whole lot more than you need them right now. If anyone gives you any disrespect, walk out on them and ask who else wants to give you a good deal.

If you are a computer science department in a university, or a company that relies on artificial intelligence for your success, you had better make your workplace a heavenly environment of trust, appreciation and reward - particularly for women.

7. It is the deep specialists in artificial intelligence who will create the breakthrough innovations - and we need a lot more of them

Turning data into an accurate predictive service, a trusted computer vision system, or a writer of compelling poetry is both an art and a science. There is a massive difference between someone downloading an existing algorithm from an online library and running it over a dataset, and a specialist who deeply understands the structure of data, statistics, maths and coding solutions, and has a creative ability to find novel solutions to a problem. It is not a 10% or 20% improvement in productivity, but more like a 10x or 20x improvement - and in many cases the difference between a solution that works and one that doesn’t. There is a lot of bad artificial intelligence in the market at the moment, because of an under-supply of quality engineers.

We do need many people with skills in deploying existing algorithms to datasets, and at the moment this is where much of the consulting-type work in artificial intelligence sits. But if Australia is to build the gold-standard solutions that solve our biggest challenges or create the next unicorn companies, we need to produce highly skilled technical specialists in artificial intelligence, at the post-graduate and probably PhD level. At the moment, my back-of-the-envelope calculations suggest that we graduate 100 or fewer PhD students in high-end machine learning per year in Australia. I suspect about half of these take jobs overseas, leaving 50 people per year to be shared amongst the businesses, government agencies and start-ups that need them. So we fill the gaps with international workers where we can, but in a market where the US and China are offering much larger salaries. The international artificial intelligence specialists are all very welcome, and we are grateful that they are willing to come to Australia to address this skills gap. But it does seem like a missed opportunity for Australians, given all these amazing career opportunities. And it still leaves us short of the talent we need to be globally competitive in this area of expertise. We need a massive investment in this high-end specialisation, and a national campaign to attract people to this career. The demand is huge. I know of a post-doctoral fellow, three years out from their PhD, who was just offered a position at $700,000 per annum - a sure sign that there is a big gap between supply and demand.

8. The growing challenge of autonomous weapons and battle systems

People are quite rightly concerned about autonomous weapons (sometimes called 'killer robots'). In the public’s mind, this brings up images of Arnold Schwarzenegger in the movie The Terminator. But this is not what autonomous weapons are. An autonomous weapon is any system that makes its own decisions about which target to aim for. The weapon could be a missile, or a tank, or a drone. The target could be combatants, drones, planes, tanks, missiles or other physical assets.

  • Land mines are very simple autonomous systems (the decision is made by how heavy you are when you step on it). While anti-personnel land mines are banned under the Ottawa Convention, anti-vehicle and sea mines are still used.
  • Anti-missile defence systems already have some level of autonomy.
  • Artificial intelligence can already beat a human pilot in virtual dogfights between fighter jets.
  • Australia, along with the United States, China, and Russia, is developing 'loyal wingman drones' which are unmanned semi-autonomous or autonomous aircraft that support manned aircraft.
  • Israel recently became the world's first country to deploy a swarm of drones in combat. These swarms can only function effectively by using artificial intelligence, although humans are still currently the decision makers about what those drones do.

There are some great advantages and terrible risks of giving weapons the ability to make their own decisions about what to target:

Advantages include:

  • Reducing civilian casualties by programming weapons to abort a mission if civilians, or important civilian assets, are at risk once the weapon is closing in on a target. This would be a brilliant outcome – civilians are usually the biggest casualties of conflict and current weapons systems still seem to be inconsistent at targeting the right military targets. Any efforts to reduce civilian harm from warfare should be seriously considered.
  • Stopping soldier rage and atrocities. War and conflict are enormously damaging to the human psyche. Soldiers who have seen their friends and family killed and maimed can look to seek revenge on a civilian population. There are some very dark periods of human history when soldiers lost control of their emotions and went on rampages of rape, torture and murder. Removing more soldiers from the field of battle can only reduce this risk - and it's obviously a lot better for the soldiers!
  • Battlefield supremacy. Artificial intelligence may be the only system that can intercept a missile, or manage a swarm of drones. The speed of artificial intelligence can help detect an enemy sooner. The ability to analyse large amounts of data can speed the process and quality of tactical decision making. Autonomous vehicles can undertake dangerous surveillance missions.

Disadvantages of giving weapons the ability to make their own decisions include:

  • They could simply be inaccurate and make lots of mistakes, exploding and killing the wrong people and things.
  • A psychopathic leader could instruct their army to produce weapons systems that systematically target every member of the population of the opposing side. Wholesale slaughter is much harder to do with a bunch of soldiers who might baulk at the task of systematically wiping out a civilian population. But not impossible. Unfortunately, instructions for wholesale slaughter of civilians have been carried out by armies in the recent past: the nuclear annihilation of Nagasaki and Hiroshima, the Chechen conflict and the genocide in Rwanda are just some horrible examples. The genocide in Rwanda sadly remains one of the highest kill rates in a war (~1,000,000 killed and up to 500,000 women raped in 100 days) and the weapons were bush knives and rifles. High-tech is not a requirement for genocide. Nevertheless, a fleet of genocidal drones would be a catastrophic use of artificial intelligence.
  • No accountability. Who is accountable if the robotic weapon makes a mistake? The General can claim a systems error. The manufacturer can claim operator error. I would argue our systems for accountability for atrocities in war are poor at the best of times. But even still, creating a process for putting accountability squarely at the feet of the commanders should be relatively simple. It is their responsibility to make sure their soldiers and equipment deliver the outcomes the commanders are seeking.

While we should definitely put in place appropriate international agreements and consequences so that these weapons systems are not used for genocide or other awful purposes, policing the use of these systems in advance is going to be difficult. And the appetite from the largest powers for controls over the autonomy of weapons systems is low. The US National Security Commission on AI has concluded that AI-enabled and autonomous weapons systems are required for military superiority, and can help reduce civilian misery from war. The Chinese People's Liberation Army is investing heavily in artificial intelligence weapons systems. President Putin is quoted as claiming that the country that dominates in artificial intelligence will 'become the ruler of the world' - but perhaps this says more about his psyche than the potential of artificial intelligence.

Nuclear weapons systems can be detected because they are radioactive, and radioactivity is very hard to hide. But software can be developed quietly and in secret, and only be discovered when it is too late. Artificial intelligence is too powerful a technology to be ignored by any armed forces. A small fleet of drones can render a tank division harmless in hours. If you detect an opponent 5% faster than they detect you, you go home and they don't. The potential for battle superiority is too great, and most defence forces are already investing in this technology now.

Nevertheless, there are efforts and campaigns related to the control of artificial intelligence in weapons systems.

  • The United Nations Convention on Conventional Weapons Group of Governmental Experts on Lethal Autonomous Weapons has been debating the matter for some time. This group is addressing Lethal Autonomous Weapons Systems (LAWS) under the Convention on Certain Conventional Weapons (CCW), and Australia remains actively engaged on these matters. The current majority view is that existing international rules are sufficient to regulate the use of LAWS, but there are some countries who would prefer to go further and ban LAWS entirely. Such a ban is not supported by most of the countries that have the largest militaries.
  • There is also growing discussion amongst the members of the Missile Technology Control Regime (MTCR) about regulating the use of autonomous drone technology. The current guidelines limit the export of some military drones, although the USA under President Trump 'reinterpreted' the rules and exported them anyway. Sigh.
  • The International Committee of the Red Cross have a view that: 'unpredictable weapons' such as land mines and cluster bombs should be banned, no autonomous systems should be allowed to target humans, and all other autonomous weapons systems should be regulated to some degree.
  • The Campaign to Stop Killer Robots argues that all autonomous weapons must be banned, and some states have already endorsed a complete ban.

We have existing arms control treaties for other horrible weapons including nuclear weapons, chemical weapons, biological weapons, cluster bombs and land mines. We have international processes for thinking through and placing agreed limits on the use of different weapons technologies. The challenge is to give these processes the teeth needed to make them effective. If we found out that a country was developing autonomous drones designed to kill only Caucasians (for example), what would be our response? Protests? Sanctions? Destruction of technological capabilities? Invasion?

It's all very tricky, but in my view, Australia should:

a) Use its role in the UN and MTCR processes to create strong international controls and serious consequences for any weapons systems that:

  • target civilians (I wonder who should be in court right now if that law was enforced?),
  • have unpredictable outcomes (landmines and cluster bombs), or
  • have the potential to be weapons of mass destruction (autonomous chemical, biological, radiological, and nuclear weapons)

b) Assume that artificial intelligence is being included in all weapons systems in all armed forces, but regulate them to be focused on:

  • improved battle-scape management, such as information retrieval and early warning systems,
  • reduced information overload for combatants, who can now capture much more information than they can process, and
  • autonomous systems for faster self-defence, such as missile defence systems and anti-drone defence systems.

c) Invest heavily in diplomacy, integrated economies, cultural understanding and exchanges, common science programs and shared understanding and agreements. The very best of all possible outcomes is no war at all. There are very few winners from any of the last few hundred wars that the world has experienced, and there is no reason to expect that a new one will be any different.

9. Ethics – the problem is people, not technology

There is, quite rightly, a lot of intellectual energy going into thinking through the ethical challenges of this new technology right now. There are valid concerns about:

  • The ability of this technology to manipulate the thinking of whole sections of society. For example, we seem to have a whole section of the United States population who believe that the most patriotic thing to do right now is to overthrow their own democratically elected government - and artificial intelligence has been used to target the messages that most appeal and influence this target group.
  • The ability to analyse our digital fingerprint from our online activity and data records and use that information against us
  • The ability to automate surveillance and use the information to take away our political, social and economic freedoms
  • Embedding bias in artificial intelligence systems by training on biased datasets or using biased analytical approaches.
  • Monopolisation of an industry by powerful artificial intelligence companies - like Facebook in social media and Google in search.
  • Technology companies that don't need a physical presence in a country can operate globally from anywhere, opening up opportunities to avoid paying tax, undermining the taxation base of nation states.
  • Killer robots as discussed above.

I think we need to divide these into two main issues:

Issue 1 – Poor standards

Some of these ethical problems arise because people without a deep understanding of the maths and software use generic tools available online to build artificial intelligence systems. In my view, these people mostly don't set out to:

  • create biased artificial intelligence
  • share data with those they shouldn't
  • create surveillance systems that are then co-opted by bad actors for more insidious purposes.

Most of the time, they are just inexpert.

Unfortunately, those who get in early and dominate can set the standards for those who follow – and if the standards are crap, then it's crap systemically.

For these types of issues, we need to learn how to build new management and regulatory systems to stop people from being their own worst enemy. There is room in the market for people to develop standards-checking services, and I suspect these will emerge rapidly. Governments could promote these services by requiring certain standards to be maintained.

Issue 2 – Evil actors

This is a more insidious problem. The technology exists. It is moving fast. It is impossible for governments to keep up with the change. Bad actors will deploy the technology for their own interests at the expense of everyone else.

If you are a very selfish person, the temptation of this technology is to use it to manipulate people to do things that are for your best interests and against theirs. At the somewhat evil end of the scale is manipulating people’s fears to sell them products. At the extremely evil end is Big Brother type surveillance to punish the politically misaligned.

One of the worrying trends in this space is the leveraging of social media's targeting algorithms to push messages that are opposed to public good messaging to a frightened public. We have seen this with the misinformation campaign around the causes of bushfires in Australia, the validity of the election results in the US and public health messaging around COVID globally. There are darker and more sinister examples, such as the massacres of Rohingya in Myanmar, driven by a military-designed social media campaign.

Government agencies I have spoken to lack the tools to resist this overwhelming wave of misinformation, which feeds on itself like a bushfire. I believe they need to learn the skills to do so, or we are in very grave danger indeed in the medium to long term.

In the end, the best defence against these more insidious actors is:

  • Strong democratic institutions,
  • High standards of behaviour by politicians and public servants to create high levels of trust in governments,
  • A strong social contract between governments and their constituents,
  • High standards of community education,
  • A well-resourced and independent media not beholden to interest groups,
  • Clever legislation that encourages freedom but not at the expense of deliberate misinformation, and
  • Global cooperation and striving for peace.

These are the only antidotes to bad actors because we can never police all of the selfish uses of any technology, let alone artificial intelligence. Bad actors will carry on regardless. Our defence needs to be systematic, not reactive.

So anyway... we obviously have a problem.

Trust in institutions and governments has been eroded by poor leadership, scandals, and corruption, and by a media looking to amplify the negative messaging. The rise of social media has lowered standards of fact-checking before information is pushed into a mass market, from a baseline that was already variable in the traditional media sector. Traditional media organisations must now be more aligned to the views of their customers in order to keep them because the lower cost of publishing information online has created thousands of competitors who can niche market their product. This puts a lot of the heavy lifting to counteract bad actors on education and legislation.

In Australia, I am somewhat hopeful because our standards of education are high and relatively consistent. We still have capacity for legislation to be developed and checked by a competent public service and parliament, although I have my concerns about the rising influence of powerful lobby groups and the declining technical capability of the public service in favour of large consulting companies.

In countries where education quality is not evenly distributed, I am much more fearful.

The best we can do as individuals is champion high-quality education, vote out corrupt and cheating politicians and governments as a matter of principle, and find a different way of funding quality journalism. Journalism is such an essential component of a functioning democracy, it was always crazy to fund it through advertising. We need a strong and independent public broadcaster that prioritises fearless, independent journalism and holds powerful institutions to account. To be honest, I would put more investment into this and less into BBC crime shows, cooking shows and panels.

Finally, the likes of Facebook, Google and other technology companies are almost too powerful for an individual government or country to take on. Bipartisanship within countries, fearless journalism and global cooperation will be required to keep the worst of their excesses under control.

10. The benefits from artificial intelligence are spread very unevenly

Artificial intelligence could and should contribute enormously to making our lives better. We could improve energy efficiency, reduce the costs of the delivery of government services, improve our mental health, improve the delivery of health services, reduce child sex trafficking, monitor and protect our environment and the list goes on and on.

But most artificial intelligence research and investment to date has gone into better targeting advertising and increasing sales. This has driven up costs, making it hard for organisations seeking to improve human well-being to access the funds and capability needed to develop the new tools that would make all our lives better.

I'd love to see better mechanisms for deploying artificial intelligence to improve community well-being, protect and improve the environment, reduce human suffering, and address what I think is still humanity's biggest issue – climate change.

11. Australia's innovation system needs an overhaul

Spending three years promoting innovation from inside an Australian university has been eye-opening. The intention to promote innovation exists among all of the actors, but in my opinion our structures and incentives are working against us.

From my experience, there seem to be few Australian businesses that have the budget and governance systems to properly invest in innovation. The comparison to the USA and Europe is depressing.

University-driven innovation is disincentivised by government funding processes and by the international student market. Academics are punished by the Australian Research Council if they spend time innovating with business instead of doing high-citation research. There is limited funding for universities to help their academics develop their ideas into businesses, so money is top-sliced from other funding pools and the resulting amounts are minuscule. Many universities want to own the majority of the equity in businesses that are created, leaving the academics to wonder what is in it for them. Driven by international rankings to attract international students, universities favour research performance over innovation, so the prospects for promotion of innovators are poor. It is all worse now as universities feel the squeeze from declining international student numbers – which is placing enormous, and perhaps sometimes unbearable, stress on academics and administrators.

Australia’s venture capital market is immature and small. It is woefully hard to attract investment for early-stage businesses in Australia, and then to fast-track their growth.

Government seems to vacillate between leaving innovation to the private sector and interfering when it is politically expedient. It demands universities generate more business outcomes but doesn’t give them the means to achieve it.

It's all a bit of a mess. And probably why Australia performs poorly in terms of translating research from universities into new businesses. And perhaps partly explains why Australia ranks low on the economic complexity index and is falling precipitously on this index compared to other countries - now ranking below Kenya, Namibia, Kyrgyzstan, Philippines and Thailand.

The AIML partnership with state government was in my view a different way of thinking about how to encourage innovation that brings together industry, government and universities. The partnership had a number of elements, but critical to the functioning of the program was an ability of AIML to ‘fund’ small pilot projects and scoping studies with organisations to develop a roadmap of innovation that was understandable to both an external organisation and a technical expert.

The result has been fantastic – with great successes already by companies such as Pickstar and Rising Sun. The South Australian government has also used the capabilities of AIML to successfully attract a suite of Australian and international technology companies to base themselves in Adelaide – generating thousands of new, high paying jobs, and helping to rebrand Adelaide as a location for international, high-tech business (note: other states have also successfully used this strategy, notably Victoria in medical technology). AIML has been a relatively small investment by the South Australian government that has made a measurable difference to the economy of South Australia, delivering a brilliant return on investment.

My humble suggestions for improvement in Australia’s innovation ecosystem include:

  • Governments to establish a funding pool specifically for universities to spin out research and reward those universities who do it well
  • State governments to double down on leveraging their high-performing research centres into major drivers of business growth
  • Massively incentivise companies, particularly in the technology area, to invest in R&D
  • Board members to receive training by their associations on how to encourage innovation within companies.
  • Make it even easier for high-tech specialists to migrate to Australia
  • Incentivise international investment to establish fit-for-purpose investment funds in Australia and give them performance benchmarks to achieve to receive tax benefits
  • Require superannuation funds to invest ~1% of funds in high-performing and proven innovation funds that invest in Australian innovation

Where to from here?

For Australia, the challenges ahead are to strengthen our institutional structures to thrive in an AI-driven economy. The national AI Roadmap and AI Action Plan are a great start but there is much more to do. We need to get some runs on the board and we need to think deeply about data as a sovereign asset and artificial intelligence as a key driver of the next economy. We have the data and the talent, we just need the imagination, courage and investment.

For me personally, I am taking some time to explore what the world has to offer. I am still doing some consulting work and am keen to run the coaching program I have established for early career researchers. I retain a link to AIML through my work on the Smarter Regions CRC bid, and a number of other larger strategic initiatives. I am also involved in a number of start-ups. But mostly, I am open to doing exciting things with interesting people on worthwhile projects.

I cannot thank the AIML team enough for giving me the opportunity to join them on their journey for a while. They are a remarkable group of people who work incredibly hard with amazing talent. In particular, I would like to thank the two Directors – Professor Anton van den Hengel and then Professor Simon Lucey. They both have a wild ambition for Australia to benefit hugely from the technology called artificial intelligence, and I enjoyed being stretched to help try and make this a reality. I am also very thankful for the people I worked with or interacted with who shared this vision, including but not limited to: two Premiers of South Australia – Jay Weatherill and Steven Marshall – who have shown bipartisan support for AIML; Rex Patrick – Senator for South Australia; Minister Pisoni and Minister Patterson; the Chief Scientist of South Australia – Professor Caroline McMillen; Andy Dunbar and the DIS team; Kim Scott, Gavin Artz and the whole DPTI team; the senior management of the University of Adelaide; and the AIML academics, executive team, postdocs and students – too many to name here. As well, I enjoyed testing ideas and coming up with new dreams with Catriona Wallace and Evan Shellshear and everyone from the Australian AI Network.
