Generative Artificial Intelligence (AI) can be a useful tool for distilling information from a variety of sources (including the Internet) to help solve problems and save time. There are, however, risks for Financial Institutions in using this technology, including uncertainty about who owns AI-created content and security and privacy concerns when proprietary company information or sensitive information about an employee, client, customer, etc., is entered into the tool. Additionally, the accuracy of the content these technologies produce must be scrutinized, as the information may be outdated, misleading, or, in some cases, fabricated.
In addition, Financial Institutions are subject to the Gramm-Leach-Bliley Act and other regulations, which prohibit the release of nonpublic personal information; such information may consist of individual items as well as lists of information. For example, nonpublic personal information may include names, addresses, phone numbers, Social Security numbers, income, credit scores, account numbers, usernames, and passwords.
Employees must consult their direct supervisor or manager before using generative AI to perform work. Company email addresses, credentials, or phone numbers must not be used to create accounts with these technologies, and no company, client, or personally identifiable data of any kind may be submitted (copied, typed, etc.) into these platforms.
Be sure to follow your institution's policy regarding the use of AI.