AI in Housing Enforcement – Is It Possible?

How will artificial intelligence (AI) function in housing enforcement? Shaun Moss, Director of Surrey Property Licensing, explores its potential applications, the legal issues it raises, and the anticipated benefits and challenges it may present.

First, what is artificial intelligence?

The Alan Turing Institute, which specialises in data science and artificial intelligence, acknowledges the absence of a universally accepted definition of AI.

According to global tech company IBM, AI refers to any technology that enables computers to simulate human intelligence and problem-solving capabilities. Whether independently or in conjunction with other technologies like sensors, geolocation or robotics, AI can perform tasks that typically require human intelligence.

AI is built on the development of algorithms modelled after the decision-making processes of the human brain. These algorithms can learn from the data provided, enabling them not only to make decisions but also to predict outcomes and formulate theories.
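As a purely illustrative aside for readers who want to see what "learning from data" means in practice, here is a minimal sketch using the open-source scikit-learn library. Every feature, label and figure in it is invented for the example; it is not drawn from any real enforcement dataset.

```python
# A minimal sketch of "learning from data": a model is shown past examples
# and then predicts the outcome for a new, unseen case.
# Requires scikit-learn (pip install scikit-learn). All data here is invented.
from sklearn.tree import DecisionTreeClassifier

# Each row: [number of past complaints, property age in years, previous notices served]
past_cases = [
    [0, 10, 0],
    [5, 60, 2],
    [1, 25, 0],
    [7, 80, 3],
]
# 1 = serious hazards were found at inspection, 0 = none found (invented labels)
outcomes = [0, 1, 0, 1]

model = DecisionTreeClassifier(random_state=0)
model.fit(past_cases, outcomes)          # "learn" from the historical examples

new_property = [[4, 55, 1]]              # a case the model has never seen
print(model.predict(new_property))       # predicted outcome, e.g. [1]
```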

AI can be categorised into three main forms:

  1. Artificial intelligence (AI): Refers to machines that can learn, generalise, or infer meaning from input, thereby replicating or surpassing human performance. The term AI can also broadly describe a machine’s ability to perform repetitive tasks autonomously.
  2. Machine learning (ML): Involves algorithms that improve their predictive or decision-making capabilities by leveraging new data. ML, a widely utilised form of AI, has driven innovations such as speech recognition and fraud detection.
  3. Advanced data analytics (ADA): Utilises specialised knowledge and techniques beyond traditional business intelligence to extract insights and provide recommendations from complex data. Techniques range from data visualisation and complex linear models to language analytics.

These three forms of computing are often interconnected. Analysts frequently combine ADA with AI or ML to achieve optimal outcomes.

How is AI already commonly being used?

Online companies like Google, Netflix and many others have been utilising AI for quite some time. One application is suggesting products and services you might be interested in, based on your usage history. Search engines use the vast amount of data provided by users to deliver relevant search results.

You may have heard of ChatGPT or Gemini. These programs can write about various topics or answer questions in a human-like manner.

Certain AI applications can detect fake news and disinformation by mining social media for sensational or alarming words and identifying authoritative online sources.

In the property sector, some estate agents are now using AI to create sales and rental listings and assist with customer service. AI also has applications in the social housing sector, such as streamlining maintenance tasks. Additionally, it can facilitate housing allocation.

The new version of The Noise App, produced by RHE Global, now draws upon AI to help identify and screen out foul and abusive content.

Next, let’s look at the uses of AI in housing enforcement

Truth be told, there is not much to see at the moment!

But we can perhaps learn something from what the police are starting to do in law enforcement. Some police forces are using AI together with facial recognition to identify persons of interest.

They are also starting to explore the use of AI in 999 call handling. Here it can help triage calls, establishing which need the most urgent response and even working out what sort of response is needed.

Potentially, AI can be used to investigate and solve crimes. It can sort through vast amounts of data or evidence much faster than humans ever can. It can spot links and patterns and could help to identify offenders.

It is said that AI can be used to predict when and where crimes might take place before they happen. Researchers from the University of Chicago have demonstrated AI that can predict crime a week in advance with an accuracy rate of 90%. If this seems like something from a sci-fi film, it’s not anymore!

Now let us look at what legislation affects the use of AI

AI is pretty new to the likes of you and me, and it is fair to say that legislation around the world is struggling to keep up too.

The EU has just passed the world’s first AI Act. Interestingly, this is based on the capacity of each particular type of AI to cause harm: the higher the risk, the more tightly that type of AI will be regulated.

In stark contrast, the UK Government has adopted a light-touch, principle-based, ‘pro-innovation’ approach to AI regulation, with a non-statutory, sector-based framework. This is underpinned by a number of principles.

Rishi Sunak, whilst prime minister, endorsed this approach at the AI Safety Summit in November 2023, stating: “The UK’s answer is not to rush to regulate.” Furthermore, in its response to a white paper on the subject, the UK Government maintained that such a law should not be brought forward until the technology matures, although one will probably come in due course.

The Information Commissioner’s Office reports that the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 do not specifically mention AI, but it has confirmed that data protection laws do apply to AI in the following circumstances:

  • When personal data is used to train an AI model

  • When personal data is used to test an AI model

  • On deployment, when personal data is used or created to make decisions about individuals.

The use of AI is certainly covered by some existing laws, for example, discrimination and equality laws, laws relating to financial services and – in particular – data protection laws.

So, it is likely that different sectors and their regulators will develop their own rules for using AI; for example, in policing, the National Police Chiefs’ Council has produced a Covenant for Using Artificial Intelligence in Policing.

What are the possible risks?

AI can get it wrong – very wrong in some cases.

The First-tier Tribunal (General Regulatory Chamber) will hear a Freedom of Information Act (FOIA) appeal against the Department for Work and Pensions (DWP) for its refusal to disclose information about its use of AI to detect fraud and errors in the benefits system. The appellant requested disclosure under FOIA of any data protection impact assessments prepared by the DWP’s Integrated Risk and Intelligence Service (IRIS) concerning the use of profiling, machine learning, or artificial intelligence. The DWP initially refused the request but, after an internal review, decided to release redacted copies of the requested information. The remaining information was withheld on the grounds that disclosure would prejudice the prevention and detection of crime. This appeal is believed to be one of the first instances of a major government department’s use of AI being scrutinised under FOIA.

In 2021, AI cost US property marketplace Zillow (the equivalent of Rightmove in the UK) $300 million and resulted in 2,000 people losing their jobs – all because an AI tool used by one of its divisions valued and bought 7,000 homes, many for much more than they were actually worth.

Issues have also been identified with AI and discrimination; it has been suggested that some systems can produce racially biased outcomes.

Not only can AI get it wrong, it can distribute that defective information quickly and easily. This could be done intentionally by malicious parties to spread fake information. The concept of GIGO – garbage in, garbage out – is very relevant here.

There is a risk that AI-based systems could be gamed. As time goes by, other parties may become aware of how an algorithm works and figure out how to defeat it.

And there is definitely a data protection issue with AI. Can you be sure the AI you might use is supplying data you are entitled to hold and use? Are you sharing data with others – or are they sharing it with you – in breach of data protection laws, perhaps inadvertently?

Data could be leaked, stolen or hacked, and there are possible legal issues involved with AI use or misuse, even if accidental. Individuals could be wrongly prosecuted or convicted. It could leave users open to criminal action or civil claims, for example, defamation or breach of intellectual property rights. There have already been court cases, particularly in the US, where AI was supposedly responsible.

Some people may question whether using AI is ethical, or if it is truly responsible.

Some principles to follow when using AI … or considering the use of AI

As AI is so new, there is very little to work with from a housing perspective, but we might be able to learn something from what is being done in policing. In its Covenant for Using Artificial Intelligence in Policing, the National Police Chiefs’ Council outlines six principles as a guide.

The covenant states the use of AI should be:

  1. Lawful
  2. Transparent
  3. Explainable
  4. Responsible
  5. Accountable
  6. Robust

How could AI be of use to us in the future?

In general terms there are several likely benefits of AI from a housing enforcement point of view, namely:

  • To automate the collection and processing of housing data and housing enforcement data (enforcement data can be particularly difficult to come by, and AI could make more, and more accurate, data available to inform our work)

  • To help automate tasks like paperwork, reports or issuing notices and enforcement actions

  • To improve efficiency and save money from departmental budgets (e.g. if AI tackles more routine admin, resources could be diverted to other things like physical inspections)

  • To improve productivity (PwC, a professional services network, found productivity growth was almost five times as rapid in the parts of the economy with the highest AI penetration as in sectors with less exposure)

  • And it could actually get you a pay rise! (PwC also found that, on average, UK employers were willing to pay a 14% wage premium for jobs that require AI skills.)

What about some practical ways AI could be used in housing enforcement?

It’s incredibly early days for all this, but here are a few possibilities.

AI could be used to sift through data to find breaches of housing law, or even possible breaches of housing law. For example, AI could scan the information from HMO or selective licensing applications and automatically draw in data from other sources such as EPC records, the Land Registry or criminal records. Information supplied by the applicant could then be validated against those sources.
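As a purely illustrative example of what that validation step might look like, here is a hedged sketch in Python. The field names, rules and records are all invented; in practice the reference data would have to come from the real EPC register, Land Registry and other systems under lawful data-sharing arrangements, with an AI tool automating or sitting on top of checks of this kind.

```python
# A hedged sketch of automated cross-checking of a licence application against
# other data sources. The records below are invented stand-ins for data that
# would, in practice, be drawn from EPC, Land Registry or other systems.

def cross_check(application: dict, epc_record: dict, land_registry: dict) -> list[str]:
    """Return a list of discrepancies between an application and reference data."""
    issues = []
    if application["declared_owner"].lower() != land_registry["registered_proprietor"].lower():
        issues.append("Declared owner does not match Land Registry proprietor")
    if application["declared_storeys"] != epc_record["storeys"]:
        issues.append("Declared number of storeys does not match EPC record")
    if epc_record["rating"] in ("F", "G"):
        issues.append(f"EPC rating is {epc_record['rating']} (below the minimum standard)")
    return issues

# Invented example records
application = {"declared_owner": "A. Landlord", "declared_storeys": 3}
epc_record = {"storeys": 4, "rating": "F"}
land_registry = {"registered_proprietor": "A. Landlord"}

for issue in cross_check(application, epc_record, land_registry):
    print("Flag for review:", issue)
```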

It could also offer the opportunity to conduct risk profiling, helping to identify landlords who are more likely to breach housing law, as well as those who already have. AI could potentially triage any violations – real or suspected – it found, then rank them in priority order for investigation and enforcement. It could perhaps do this based on the prospects of a successful outcome – or of a successful conviction. How useful would that be!
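To illustrate the triage idea, here is a simplified sketch of risk scoring and ranking. The factors, weights and example cases are invented; a genuine machine-learning approach would learn the weightings from historical outcome data rather than having them hand-set, and any real scheme would need the legal and ethical safeguards discussed above.

```python
# A simplified, illustrative sketch of risk profiling and triage: each case is
# given a score from a few weighted factors and cases are ranked for attention.
# The factors, weights and example data are all invented for illustration.

WEIGHTS = {
    "previous_convictions": 5.0,
    "complaints_last_year": 1.5,
    "licence_conditions_breached": 3.0,
    "overdue_certificates": 2.0,
}

def risk_score(case: dict) -> float:
    """Weighted sum of the recorded risk factors for one landlord/property."""
    return sum(WEIGHTS[factor] * case.get(factor, 0) for factor in WEIGHTS)

cases = [
    {"landlord": "Landlord A", "previous_convictions": 1, "complaints_last_year": 4,
     "licence_conditions_breached": 0, "overdue_certificates": 2},
    {"landlord": "Landlord B", "previous_convictions": 0, "complaints_last_year": 1,
     "licence_conditions_breached": 1, "overdue_certificates": 0},
]

# Highest-risk cases first: a rough triage list for investigation
for case in sorted(cases, key=risk_score, reverse=True):
    print(f"{case['landlord']}: score {risk_score(case):.1f}")
```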

In summary – some points to bear in mind

To achieve the right balance between the benefits and risks of AI within regulation, close collaboration with members of the public, the Government, and private sector partners is essential. Councils should focus on creating a clear value proposition, along with ethical use principles and intervention criteria for AI’s application across all regulatory activities. There is a significant opportunity to develop ethical frameworks and regulations proactively.

We need to be aware of the options that AI could open up in housing enforcement and appreciate the risks that come with the use of AI.

Although there is not much to learn from housing enforcement’s current use of AI, other sectors – particularly policing – are further ahead in exploring both the possibilities and the risks. We should perhaps look to learn from their successes and failures.

Remember: This is an extremely fast-developing and fast-changing technology. It is important to keep a close eye on what is going on.

And lastly, if you think we don’t need AI, or it’s a fad that won’t last, you’re probably wrong. You could be left behind. Other people will be looking to take advantage of the potential.

Don’t be an AI dinosaur!

Shaun Moss, Director of Surrey Property Licensing
