As the opportunities for artificial intelligence (AI) to deliver innovation in the housing sector continue to grow, many housing providers are asking how they can embrace AI in a compliant manner.
To use AI effectively and compliantly, organisations can adopt a broad, cross-departmental approach. The speed and extent to which a housing provider wants to embrace AI isn’t just an IT issue but also a cultural and strategic one, likely to be considered all the way up to board level.
AI everywhere?
At a time of competing priorities and limited funds, housing providers are unlikely to be able to justify AI at every possible opportunity and need to ask questions such as: what problem am I trying to solve; what does AI add; what do I want to use AI for; and what are my strategic priorities? Organisations should also consider internal training so that the potential risks, such as confidentiality when using OpenAI’s tools, are understood. Some other potential risks are considered below.
The need for planning and coordination shouldn’t deter housing providers from adopting AI. An effective AI strategy can help ensure adoption is done in a compliant manner, for example by answering key questions such as:
- Which departments will be affected by this feature?
- Does it require a dedicated team for monitoring, and what training is needed?
- Which department will be responsible for the information generated by this feature?
- What are the possible use-cases for the information generated?
- Is personal data being collected and/or processed, and have we confirmed compliance with UK GDPR?
- Are we analysing the incoming information? If so, which department will benefit most from this analysis?
AI has the potential to enhance many aspects of housing providers’ operations. For example, some have found AI particularly helpful in streamlining their tenant interactions and complaint procedures. AI is allowing housing providers to use increasingly innovative systems, such as:
- Optimising maintenance and repairs: AI modelling and analysis can help housing providers to forecast and identify maintenance patterns, allowing them to resolve problems before they become critical.
- Increasing operational efficiency: AI can automate routine tasks such as processing tenancy applications and collecting rent, freeing up staff to focus on more strategic activities. It can also be used to predict when residents may be experiencing financial difficulties, allowing housing providers to engage with those residents earlier.
- Document management: Managing and sorting documents, such as leases, agreements and maintenance records, can be challenging. AI allows tenants to access and manage their documents through dedicated portals, improving access and reducing wait times. Resident portals can also be personalised, offering more relevant information.
The importance of data
When integrating or upgrading systems, it’s important to consider what is already in place. Most (if not all) housing providers still have legacy systems storing vast quantities of historical data. Data is key to all AI models: bad data in means bad data out. Some legacy systems may have compatibility issues, and the data they hold may be fragmented or inconsistent. Housing providers will benefit from understanding the impact of data quality on their plans, and a data audit can help assess the quality and integrity of an organisation’s data.
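By way of illustration only, a basic data audit might start with simple completeness, uniqueness and consistency checks. The following is a minimal Python sketch, assuming a hypothetical CSV export of tenancy records with placeholder column names such as `tenancy_id` and `start_date`:

```python
# Minimal data-quality audit sketch (illustrative only).
# Assumes a hypothetical CSV export of tenancy records with placeholder
# columns such as "tenancy_id" and "start_date".
import pandas as pd

records = pd.read_csv("tenancy_records.csv")  # hypothetical legacy-system export

# Completeness: what share of each field is missing?
missing = records.isna().mean().sort_values(ascending=False)

# Uniqueness: duplicate identifiers often point to fragmented legacy data.
duplicates = records["tenancy_id"].duplicated().sum()

# Consistency: dates that fail to parse suggest inconsistent formats across systems.
parsed_dates = pd.to_datetime(records["start_date"], errors="coerce")
unparseable_dates = parsed_dates.isna().sum()

print("Share of missing values per column:\n", missing)
print("Duplicate tenancy IDs:", duplicates)
print("Unparseable start dates:", unparseable_dates)
```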
AI models are trained on data, and although it is difficult to gather and adjust data so as to eliminate bias, there are practical steps to mitigate it, such as understanding the data a model is trained on (to allow an assessment of potential bias) and ensuring human oversight. This also helps with transparency and explainability compliance: can you understand how and why the AI model reached a particular outcome or output? These considerations apply not only at the point of adoption but throughout an AI model’s lifecycle.
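As a hedged illustration of one simple bias check, the sketch below compares favourable-outcome rates across groups and computes a disparate-impact ratio; the column names and figures are hypothetical placeholders, not drawn from any real housing data:

```python
# Illustrative bias check: compare favourable-outcome rates across groups.
# Column names ("group", "decision") and values are hypothetical placeholders.
import pandas as pd

data = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B", "B"],
    "decision": [1, 1, 0, 1, 0, 0, 0],
})

# Rate of favourable decisions per group.
rates = data.groupby("group")["decision"].mean()

# Disparate-impact ratio: lowest rate divided by highest rate.
# A value well below 1 suggests the data or model warrants human review.
ratio = rates.min() / rates.max()

print(rates)
print("Disparate-impact ratio:", round(ratio, 2))
```

A low ratio does not prove bias on its own, but it is the kind of signal that should prompt human review of both the underlying data and the model’s outputs.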
Data protection policies
At the heart of any AI system is data, whether freshly generated (which may require an update to your privacy policy and/or informed consent in order to comply with UK GDPR) or gathered from existing information (in which case UK GDPR requires that your processing is compatible with the purpose for which the data was originally collected).
Before implementing AI systems, it is important to understand how the data will be collected and processed, and the extent to which there will be any sharing or additional use of that data. Housing providers must therefore have robust data protection policies that not only comply with UK GDPR but are also capable of adapting to the complexities of AI.
With many AI models relying on data as part of their own self-learning, it is key for housing providers to ensure that this is understood and considered in the context of data protection. Similarly, ownership of AI-generated data must be considered: if the AI provider won’t benefit from the AI’s outputs (typically aggregated/anonymised), this might have an impact on a number of matters, including price.
Risk management
As the potential roles for AI develop and expand, so too does the potential impact of service failures. Where AI is involved in potentially critical services, it’s important for housing providers to be aware of this risk and to have mitigation procedures in place, whether as part of an AI policy, a disaster recovery and business continuity policy, or otherwise.
The growth and development of AI has huge potential for housing providers. However, its implementation should be coupled with conscious consideration of key potential risks such as legacy systems, underlying bias, transparency and compliance with data protection. It should also be approached with a strategic mindset that involves cross-departmental collaboration and accountability, and aligns AI policies with the organisation’s core values.
Joanna Bouloux is a partner at Devonshires.