Housing Technology interviewed AI specialists from Civica, Converse360, Made Tech, Mobysoft, NEC Software Solutions, Orlo and Riverside Group on the business imperatives, internal and external operations, pitfalls and advice, ethical questions and staff concerns associated with the adoption of AI by housing providers.
The business case for AI in housing
Glen Ocskó, head of local government at Made Tech, said, “We’re all talking about AI and machine learning but the data that’s needed to make it worthwhile is atrocious; the data is shallow, lacking substance and not shared because every housing provider keeps its data to itself.
“The only reason platforms like ChatGPT are quite useful is due to the sheer volumes of data that have been pumped into them. In housing, we don’t have that much data because people aren’t sharing non-identifiable data with each other to allow us to train AIs. All we can do is take a thin slice of data and put a superficial AI layer on top of it. The first thing we need to get right is the data that feeds and trains the AI.”
Trevor Hampton, director of housing solutions at NEC Software Solutions UK, said, “AI can have the biggest impact in income collection and asset management. In both areas, the earlier problems are identified, the more cost-effective it is to deal with them.
“For example, AI can help prevent tenants falling into arrears by spotting spikes in late or non-payments within specific categories. If a pattern is forming, housing providers can take a more joined-up approach and provide support to a particular group of tenants. When applied to assets, it can help drive efficiencies and compliance across a housing provider’s entire portfolio by predicting repairs, maintenance and safety checks.”
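To make the arrears example concrete, the sketch below shows, in Python, the kind of simple spike detection Hampton describes: flagging a tenant category whose latest month of missed payments jumps well above its earlier average. The payment records, field names and threshold are hypothetical and purely illustrative, not taken from any NEC product.

```python
from collections import defaultdict
from statistics import mean

# Illustrative payment records: (tenant_category, month, missed_payments).
# Categories, months and counts are made up for the sake of the example.
payments = [
    ("sheltered", "2024-01", 3), ("sheltered", "2024-02", 4), ("sheltered", "2024-03", 9),
    ("general",   "2024-01", 5), ("general",   "2024-02", 6), ("general",   "2024-03", 5),
]

def flag_spikes(records, factor=1.5):
    """Flag categories whose latest month of missed payments exceeds
    the average of earlier months by a given factor."""
    by_category = defaultdict(list)
    for category, month, missed in sorted(records, key=lambda r: r[1]):
        by_category[category].append(missed)

    flagged = []
    for category, counts in by_category.items():
        if len(counts) < 2:
            continue
        baseline = mean(counts[:-1])
        if baseline and counts[-1] > factor * baseline:
            flagged.append((category, counts[-1], round(baseline, 1)))
    return flagged

for category, latest, baseline in flag_spikes(payments):
    print(f"Spike in '{category}': {latest} missed payments vs baseline {baseline}")
```

In practice the same idea would run over live rent-account data and feed an early-intervention workflow rather than a print statement.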
Alison Stock, group director for digital & technology at Riverside Group, said, “The business case for AI in housing is to improve the customer experience with chatbots or conversational AI assistants. Customers get almost immediate responses to simple queries without the need for a phone call, leaving housing staff free to deal with more complex queries.
“Riverside already provides some simple chatbots to help customers with various standard queries, through our Salesforce CRM application, with all of our chatbots having been developed by our customer services team itself, supported by our in-house IT staff.”
Paul Berry, head of product management at Civica, said, “Using AI as one pillar of a wider IT strategy creates a strong business case for AI. A practical starting point is identifying specific use-cases for quick wins; for example, enabling customer interactions through a chatbot is a low-cost and low-complexity option. And while that sort of customer-facing automation is a strong reason to use AI, using it to augment human decisions by providing deep analysis can be equally valuable.”
AI for internal operations
Karl Roberts, technical director at Converse360, said, “Accessing AI is becoming much easier for non-technical teams so housing providers can experiment with all sorts of new AI services. ChatGPT (a large language model [LLM] AI) and numerous other generative-AI services now provide user-friendly interfaces for creating a range of valuable new services. Within housing operations, AI can be used to create content, perform data analysis and trend spotting, detect fraud, produce reports and automate repetitive processes.”
Dominika Phillips-Blackburn, head of product (income) at Mobysoft, said, “For their internal operations, housing providers can use AI to leverage their existing datasets to improve tenant satisfaction, reduce arrears and streamline maintenance operations as well as forecast the future.”
Hardeep Johal, head of product and development at Orlo, said, “Some realistic examples of how AI can help housing providers with their internal processes include predictive maintenance and repairs, algorithmic allocation of resources such as staff, equipment, and supplies, fraud detection, chatbots and virtual admins, and data mining and predictive analytics.”
AI for external operations
Civica’s Berry said, “People use the term AI to describe a variety of underlying technologies and solutions. Examples of these range from low-intelligence robotic process automation (RPA) through to trained machine-learning algorithms making complex predictions.
“Increasingly common AI applications include the bulk processing of tenant data from external sources (such as universal credit), virtual assistance for customer service, predictive maintenance and arrears analytics. Some AI solutions focus on delivering greater efficiencies, thereby freeing housing staff to concentrate on higher value activities, while others provide better outcomes to human decisions by providing real-time insights.”
Made Tech’s Ocskó said, “We’re all frustrated with chatbots and the formulaic answers they provide. ChatGPT will take us to the next level where we can put lots of data in the background so it can perform natural language processing (NLP) and understand what someone is really asking rather than just giving a set of options to choose from; common customer queries such as ‘my tap is leaking, what are my options?’ can easily be taken off the table with AI.
“If every housing provider could feed data into a communal pot (in an ideal world), then we could use machine learning to work out patterns for, say, pre-emptive repairs for customers. However, this would only work with mass learning and mass data and no single housing provider in the country would be able to give the amount of data needed to make this happen.”
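As a toy illustration of the free-text understanding Ocskó describes, the sketch below maps a query such as ‘my tap is leaking’ to a repair category using simple keyword overlap. A real conversational AI would use an LLM or a trained NLP model; the intents and keyword lists here are invented for illustration only.

```python
# A toy intent matcher for free-text repair queries. Real conversational AI
# would use an LLM or trained NLP model; this only illustrates the idea of
# mapping free text to a known repair category instead of a fixed menu.
REPAIR_INTENTS = {
    "plumbing_leak": {"tap", "leak", "leaking", "drip", "pipe", "water"},
    "heating_fault": {"boiler", "heating", "radiator", "cold", "thermostat"},
    "electrical":    {"socket", "light", "fuse", "power", "electric"},
}

def classify_query(text: str) -> str:
    """Return the repair intent whose keyword set best overlaps the query."""
    words = set(text.lower().replace(",", " ").replace("?", " ").split())
    scores = {intent: len(words & keywords) for intent, keywords in REPAIR_INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(classify_query("My tap is leaking, what are my options?"))  # plumbing_leak
```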
Riverside’s Stock said, “Chatbots, automation and clever workflow design can help customers self-diagnose repairs and make rent payments more easily, generally helping to improve customer services. And automation can help drive data integrity and data quality, giving us a golden thread of accurate data, which will be essential as we develop more capability around predictive analytics and building information modelling (BIM).
“There are many ways in which IoT, machine learning and AI are either connected or overlap. IoT devices, such as smart thermostats, might themselves use machine learning and AI, operating in an interconnected cloud of sensors powered by AI and rule-based systems.”
NEC’s Hampton said, “Housing providers can use AI to identify tenants’ behaviour and tailor their support accordingly. For example, many tenants are making the choice between heating or eating right now and using intelligent IoT sensors to analyse how they are heating and ventilating their homes and the effect on their utility bills could be very helpful. The IoT data might show that they’re leaving too many windows and doors open during the day, making it more expensive to heat the property in the evening.”
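A minimal sketch of the kind of rule Hampton’s IoT example implies is shown below: combine window-open hours during the day with evening heating use to suggest which tenants might benefit from advice. The readings, field names and thresholds are hypothetical and not drawn from any real sensor platform.

```python
# Hypothetical IoT readings: window-open hours during the day versus evening
# heating energy use. Field names and thresholds are illustrative only.
readings = [
    {"property": "Flat 1", "window_open_hours": 6.5, "evening_kwh": 9.2},
    {"property": "Flat 2", "window_open_hours": 0.5, "evening_kwh": 3.1},
    {"property": "Flat 3", "window_open_hours": 5.0, "evening_kwh": 8.7},
]

def heating_advice(rows, open_hours_limit=4.0, kwh_limit=7.0):
    """Suggest properties whose tenants may benefit from ventilation/heating advice."""
    return [
        r["property"]
        for r in rows
        if r["window_open_hours"] > open_hours_limit and r["evening_kwh"] > kwh_limit
    ]

print(heating_advice(readings))  # ['Flat 1', 'Flat 3']
```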
Pitfalls to avoid
Made Tech’s Ocskó said, “AI is being hugely over-promised, insofar as it can apparently solve everything. It is a tool to be used, nothing more. Don’t listen to promises that you can reduce head count and cut budgets; that’s entirely the wrong approach to AI.”
Converse360’s Roberts said, “Implementing AI should be treated like all other IT projects; you can’t just install a service and leave it to run without constant monitoring. Set your vision, think big but start small, and plan carefully. Decide which use-cases can add the most value and could contribute immediately. Experiment and review the outcomes in days or weeks, then grow the successful elements and ‘fail fast’ those that don’t add value.
“In the early days of your AI projects, more supervision will be needed but this will substantially reduce over time. Although AIs can train themselves and automatically improve their data, this approach can be problematic and dangerous; humans are needed to reinforce the training data and ensure the outputs are factual, consistent, unbiased and free of controversial content and profanities.”
Mobysoft’s Phillips-Blackburn said, “Data quality is usually the biggest hurdle when implementing AI. Poor data generates poor predictions, and this is frequently the case when housing providers rush into implementing AI.
“It’s very difficult to build a trustworthy AI system that accurately reflects human judgement. Self-learning AI and ‘black box’ models often produce questionable results and suffer from a lack of human supervision during the training process. This is also the case for negotiating regulatory and ethical concerns because AI systems are often not designed to operate within regulatory boundaries or tested against the potential underlying biases within the training data.”
Orlo’s Johal said, “It’s wrong to assume that AI doesn’t require a level of training and learning; human resources are still needed. For example, chatbots require knowledge bases to be fed with existing customer information for up-to-date FAQs, help pages and guides. Most importantly, don’t assume that AI can replace your contact centre. Yes, AI is great for customer service but it’s important not to lose the human touch.”
Ethical concerns
Civica’s Berry said, “AI can be subject to bias from the underlying data, the human programming of algorithms or the training of the machine. There are also concerns around privacy as the potential for large-scale data collection and analysis becomes easier. Automation can also raise accountability questions when there is no human intervention – who is responsible for mistakes?
“These concerns can be mitigated with careful design, for example, by removing or minimising the use of any sensitive data which could lead to such biases. Robust governance processes and continuous monitoring also provide assurance and, where necessary, models can be re-trained to ensure the right outcomes.
“Using AI to augment human decisions, rather than allowing complete autonomy, can mean that a human decides a final outcome so there can be more confidence of oversight. Furthermore, an open approach to sharing details of the technologies and how and why they are being used can provide reassurance; frameworks such as the algorithmic transparency standard can be adopted to provide a consistent approach and ensure clear information is available.”
Converse360’s Roberts said, “AI can work in a number of ways; it can consume data from LLMs (such as ChatGPT) or it can use private, specific and self-contained data. The data LLMs are exposed to may not all be factual and, when summarising data, LLMs can ‘hallucinate’ and produce inaccurate information. When exposing AI engines to your customer data, you should be aware that there will be a lot of personal information that must remain private.”
NEC’s Hampton said, “The major ethical concern around AI is to do with unconscious bias. The fear is that if trained inaccurately, the AI could inadvertently discriminate against certain groups or individuals. Adopting a sector-wide approach to best practice and determining the right reasons for applying it will ensure AI stays within ethical boundaries and improves tenants’ lives. And because AI regulation is evolving slower than the AI technology itself, there needs to be a higher level of governance and sign-off than for traditional technologies.”
Riverside’s Stock said, “Technology innovators (and doomsayers) have highlighted the combined promise and perils of AI, with the latter including the AIs taking over and replacing us. But if designed well with carefully-chosen, simple use-cases, I think the opposite will be true and AI will usher in a new era of customer service.
“AI will mean that housing providers will need to skill up their employees for this ‘brave new world’ but most employers do this all the time as part of their day-to-day operations. For example, at Riverside, we have an ongoing focus on our colleagues’ digital capabilities, encouraging the use of a comprehensive e-learning portal, technology ‘spotlight sessions’ and IT ‘driving licences’.”
AI & data management strategies
Made Tech’s Ocskó said, “AI should accelerate data management strategies on a huge scale and open them up. If we’re going to collectively make use of AI, we can’t live in a world where we are hiding data away. Organisations need to anonymise their data and release it in such a way that AIs can learn from real-life situations.
“The problem with ‘data management’ is that it’s often misconstrued as ‘data protection’ and locking away data. It is really about how you use data to manage your services better, so opening it up in the right ways is key to making AI work effectively.”
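As a very rough sketch of the anonymisation step Ocskó argues for, the Python below drops direct identifiers and replaces the tenant reference with a salted hash before data could be pooled. The field names and salt are hypothetical, and real anonymisation would need a privacy impact assessment and a re-identification risk review rather than this simple transform.

```python
import hashlib

# Minimal illustrative anonymisation: remove direct identifiers and replace
# the tenant reference with a salted hash. Field names are hypothetical.
SALT = "org-specific-secret"          # hypothetical; keep out of source control
DIRECT_IDENTIFIERS = {"name", "address", "phone"}

def anonymise(record: dict) -> dict:
    """Return a copy of the record safe(r) to share for model training."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    cleaned["tenant_ref"] = hashlib.sha256(
        (SALT + str(record["tenant_ref"])).encode()
    ).hexdigest()[:16]
    return cleaned

print(anonymise({"tenant_ref": "T1001", "name": "A Tenant",
                 "address": "1 High St", "repair_count": 4, "arrears": 120.50}))
```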
Mobysoft’s Phillips-Blackburn said, “There is a trend to aggregate all data in one place to allow AI models to be developed on big data sets. This means consolidating data from disparate sources but also making sure that the data is clean. It also means that housing providers must take their AI plans into account when considering their wider data management strategies.”
NEC’s Hampton said, “There is so much hyperbole around AI that a less exciting but crucial aspect to it has largely been ignored and that’s how it has been steadily improving data management. Data quality is improving because AIs must be trained using clean and accurate data or they will make poor or wrong recommendations; better data quality, accessibility and cyber security are all beneficiaries of the increasing use of AI.”
Orlo’s Johal said, “AI can have a significant impact on housing providers’ data management strategies but it’s very important to ensure that AIs are implemented ethically and are bounded by appropriate data-governance policies to protect privacy and prevent bias.”
Is AI tactical or strategic?
Civica’s Berry said, “Tactical AI applications focus on specific tasks, problem solving and data-driven decision-making in limited contexts. Examples include AI-powered chatbots for repairs diagnostics, pattern-recognition algorithms for detecting defects to the external fabric of properties using satellite, drone and Google Earth-type imagery, and natural language processing (NLP) for sentiment analysis.
“Strategic AI applications focus on analysing complex systems, predicting future trends and helping with high-level decision-making. Examples include AI systems predicting a housing provider’s future spending based on building components using deterioration modelling and other contextual data, or even helping to define corporate strategy.”
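To ground the ‘strategic’ example Berry gives, here is a toy straight-line deterioration forecast in Python that sums the replacement cost of components expected to reach end of life within a planning horizon. The component data, lifespans and costs are invented; real deterioration modelling would use surveyed condition data and more sophisticated curves.

```python
# Toy straight-line deterioration model: which components fall due within the
# planning horizon, and what would they cost to replace? All figures invented.
components = [
    {"name": "roof",    "age": 35, "lifespan": 60, "replacement_cost": 12000},
    {"name": "boiler",  "age": 12, "lifespan": 15, "replacement_cost": 2500},
    {"name": "kitchen", "age": 18, "lifespan": 20, "replacement_cost": 7000},
]

def spend_forecast(items, horizon_years=5):
    """Sum the replacement cost of components reaching end of life within the horizon."""
    return sum(
        c["replacement_cost"]
        for c in items
        if c["lifespan"] - c["age"] <= horizon_years
    )

print(f"Projected spend over 5 years: £{spend_forecast(components):,}")
# Projected spend over 5 years: £9,500
```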
Converse360’s Roberts said, “AI can be both tactical and strategic. Tactical AI might be using it to resolve a specific problem or for a particular use-case; it may be cost-effective and quick to deploy and it solves an immediate problem. A strategic view is needed if you want to start automating processes, integrating systems or digitising new services. For those strategic developments, there’s a lot more planning needed to really understand what outcomes you’re looking for – for example, we’re now seeing many housing providers trying to re-imagine how they can deliver services rather than just digitising what they’re already doing.”
Mobysoft’s Phillips-Blackburn said, “AI informs strategic decisions because it takes time and careful management to build models which are accurate, ethical and trusted. However, those same insights can be used to make short-term, tactical decisions.”
The (misplaced) threat to housing staff
Made Tech’s Ocskó said, “Housing staff shouldn’t feel threatened by AI. AI should liberate them from dealing with mundane, repetitive queries. Staff will be able to work on the interesting cases and complex scenarios that AIs are unprepared for and unsuited to.”
NEC’s Hampton said, “AI isn’t a threat to housing staff and shouldn’t be perceived as such. AI isn’t about removing staff, it’s about enabling them to have greater insights to make better decisions. To put it in context, the volume of quality data available to train an AI isn’t nearly enough for it to perform even the simplest of tasks carried out by a typical housing officer.”
Orlo’s Johal said, “AI can be daunting if not implemented correctly, whether that’s trying to do too much too soon or staff not being sufficiently trained to allow the AI to complement their roles. Above all, it’s important for senior teams to make sure their staff understand why AI will be beneficial rather than threatening.
“If done ethically, AI shouldn’t be seen as human replacement but an enhancement to day-to-day roles. For example, if 80 per cent of inbound enquiries are FAQs and repeated queries, the remaining 20 per cent needs a more hands-on approach, which is where customer-service staff can really excel.”
Riverside’s Stock said, “AI is not a threat to housing staff – for example, most housing providers routinely use AI as part of their cybersecurity capabilities. Some housing providers are already on their second generation of AIs, moving beyond simple chatbots and automated messages to use sentiment analysis to prioritise their most complex calls, matching the calls with the most appropriate person to tackle the problem there and then with the customer.
“Riverside implemented 8×8’s ‘contact centre as a service’ (CCaaS) solution last year and we’re now set up to incrementally turn on new capabilities as we go along, as well as use our Salesforce CRM software to generate ‘next best actions’ in real time for our call-centre advisors.”
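As a toy version of the sentiment-driven prioritisation Stock mentions, the sketch below scores call transcripts against a small negative-word list and puts the most negative calls first. The word list, transcripts and scoring are illustrative only; a production contact-centre platform would use a trained sentiment model.

```python
# Toy sentiment-based call prioritisation: score each transcript against a
# small negative-word list and handle the most negative calls first.
NEGATIVE_WORDS = {"angry", "leak", "urgent", "complaint", "cold", "unsafe"}

calls = [
    {"id": 101, "transcript": "I just want to check my rent balance"},
    {"id": 102, "transcript": "My flat is cold and the leak is urgent"},
    {"id": 103, "transcript": "This is a complaint, I am very angry"},
]

def negativity(call: dict) -> int:
    """Count how many negative words appear in the transcript."""
    words = set(call["transcript"].lower().replace(",", " ").split())
    return len(words & NEGATIVE_WORDS)

for call in sorted(calls, key=negativity, reverse=True):
    print(call["id"], f"(score {negativity(call)}):", call["transcript"])
```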
Examples of AI in housing
Civica’s Berry said, “Good examples of AI in our sector include arrears analytics solutions delivering operational efficiencies and financial benefits through a combination of automation and human-decision augmentation, and using RPA to automate repetitive tasks such as universal credit reviews.”
Converse360’s Roberts said, “Many housing providers are using our ‘conversational AI’ platform to automate customer service and offer enhanced services outside business hours, with HMS data used to identify customers and provide personalised responses. That said, we’ve seen various implementations of basic chatbots that don’t use AI, can’t connect to corporate data and don’t integrate with contact centres; these siloed offerings don’t offer a great service.”
Orlo’s Johal said, “Bromford is a great example of AI in housing. After the same-day implementation of our chatbot, the housing provider saw over a quarter of its customers’ enquiries resolved without any need for a human agent, thereby saving time and reducing its annual costs by over £330,000.”
Made Tech’s Ocskó said, “There are so many companies out there claiming that AI will predict everything, but they only have access to some early AI tools – it just isn’t ready yet. AI will only be really useful when one of the big software companies can collate, manage and process the amount of collective data required for a meaningful resource.”
Housing Technology would like to thank Paul Berry (Civica), Karl Roberts (Converse360), Glen Ocskó (Made Tech), Dominika Phillips-Blackburn (Mobysoft), Trevor Hampton (NEC Software Solutions UK), Hardeep Johal (Orlo) and Alison Stock (Riverside Group) for their editorial contributions to this article.