The idea of outsourcing all of the boring parts of your work as a social housing landlord to AI is likely to appeal. However, using AI is not without its risks, and it is worth figuring out whether AI can help you be more compliant or whether it will open the door to more problems.
How can AI help social housing landlords?
With growing pressure to improve safety standards, embracing advanced technology might be the secret to meeting the tight new deadlines.
AI-integrated monitors and detectors can now improve the detection of criminal activity and the risk of fire or mould by monitoring movement, temperature, or humidity. They work as a network of connected sensors, an Internet of Things (IoT), that creates a safety net within your property.
AI can also be useful for summarising reports and data, which helps when filing your own reports. Rather than trawling through multiple documents to find the key information, you can use AI programmes that pull out what you need and present it in a clear format.
It is best to edit AI-generated writing, though, to make sure that it is accurate, as Large Language Models are known to hallucinate, producing data or information that does not exist. Used carefully, AI can speed up the process, helping you keep within filing deadlines and maintain consistency across multiple documents.
Chatbots and virtual assistants can also help you respond to tenants outside office hours and free up staff time. They should be used carefully, though, so as not to alienate your tenants.
What are the downsides of AI use for social housing landlords?
If talking about AI sensors made you instinctively glance at the budget, you are one step ahead of us. There is a cost associated with AI implementation, and non-AI-powered solutions are often cheaper.
However, the arrival of Awaab’s Law has fundamentally changed the level of responsiveness that you are expected to maintain, and it is worth asking how you will keep pace with the new guidelines if you are not aware of problems as they form.
There are also security and privacy concerns around the use of AI. It would be irresponsible to enter confidential data into unsecured AI programmes, as there is no guarantee that the programme will not surface that data to other users. Regardless of how secure they claim to be, publicly available, free AI programmes remain a concern, as there have been notable data breaches that cannot be fully explained.
There are strict guidelines to follow when using AI to keep the data that you input secure and to protect your tenants. It is also worth noting that decisions should not be taken purely on the basis of AI advice. AI models are often trained on unrepresentative datasets, and following their recommendations uncritically could lead to discrimination under the Equality Act 2010.
Before taking any action, it is worth seeking professional legal advice. We are on hand to help you determine whether AI would work for you and, if you decide it will make you a better social housing landlord, to implement it in a fully compliant way.
Don’t let AI adoption lead you into non-compliance. Speak to our team today!
The content of this article is for general information only. It is not, and should not be taken as, legal advice. If you require any further information in relation to this article please contact the author in the first instance. Law covered as at July 2025.