AI tools can add real value to your working day, but only if you use them safely. As a mortgage professional, you handle some of the most sensitive personal and financial data there is. Here’s what you should do every time you use an AI tool.
Before You Start
- Check the privacy settings of any AI tool you use and turn off data sharing or training modes. Most consumer AI tools have this option in settings
- Make sure you are using an approved AI tool. If in doubt, check with your compliance team before using it with anything client-related
- Confirm that the AI tool you are using is not a consumer tool (such as ChatGPT, Gemini or Copilot). These are not designed for regulated client data
- Enable Multi-Factor Authentication (MFA) on any platform or system you use to access AI tools or client data
When Using AI Tools
- Never paste real client data into an AI tool. This includes names, addresses, dates of birth, financial information, or case details
- Anonymise information before entering it. Replace names with initials or generic references (e.g. “Client A”), remove specific financial figures where possible, and strip out any identifying details
- Never upload fact finds, bank statements, payslips, or ID documents to a consumer AI tool
- Never use AI to rewrite sensitive client communications using real client data
- Apply a simple rule: if you wouldn’t want the information made public, don’t enter it into an AI tool
Passwords and Access
- Use a unique, strong password for every system. Never reuse passwords across platforms
- Use a password manager to store credentials securely
- Avoid predictable password variations (e.g. Broker123!, Broker124!)
- Review who has access to your systems regularly and remove access immediately when someone leaves your business
Everyday Habits
- Lock your screen every time you leave your desk (Windows: Windows Key + L / Mac: Command + Control + Q)
- Check AI-generated outputs before using them. Always review and verify anything an AI tool produces before sharing it with a client or using it in advice
- Don’t assume AI is always right. AI tools can generate plausible but incorrect information. If something doesn’t look right, check it against a reliable source
- Keep a record of how you have used AI in your business. If questioned by a compliance auditor, you should be able to demonstrate how and where AI has been used in your processes
- Don’t rely on AI for regulated advice decisions. AI can support your process but the advice, and the responsibility, remains yours
- Be cautious about what you copy and paste from AI outputs into client-facing documents. Always sense-check for accuracy, tone, and compliance
- Log out of AI tools when you are finished. Don’t leave sessions open on shared or unattended devices
- Stay up to date with the terms and conditions of any AI tool you use. Providers update their data and privacy policies regularly, and the details matter
A Final Reminder
Used responsibly, AI tools can save time and improve your service. But the safe handling of your clients’ data starts with you. When in doubt, leave it out.