5 questions every in-house counsel should ask before implementing AI

Businesses face a growing expectation to use AI to increase efficiency, improve customer experiences and reduce costs.
Image source: Jakub Jirsak – 123RF.com

Media articles and workplace conversations consistently endorse the use of AI to “stay ahead of the competition”, creating anxiety that a business not using AI will be left behind.

Consequently, most businesses are under significant pressure and have either already implemented an AI tool or are on track to do so imminently.

The rush to board the AI train presents in-house counsel with the new and difficult task of considering the potential legal, ethical and reputational consequences of AI tools.

This task is complicated by the fact that this is largely uncharted territory: in-house counsel does not have the benefit of being guided by case law or legislation since few cases dealing with AI exist and South Africa (like the majority of countries) does not have legislation that is specific to AI.

To overcome this challenge, in-house counsel must ask critical questions before implementing an AI tool. Below are five questions to guide this process.

What data is the proposed AI tool using for training?

AI tools depend on the quality of their training data, and any inaccuracies in that data can adversely affect their performance.

This data can be obtained by AI vendors internally or externally, but its use often requires prior authorisation. Failing to obtain this authorisation may expose your business to significant risk and reputational harm.

If your business is considering sourcing its data internally, which is common for AI used on specific projects, consider questions such as, “Who owns the rights to this data?”; “Could its use breach client confidentiality or expose commercially sensitive information?”; and “Will the AI vendor use the business’s data for other purposes?”

What safeguards does the AI vendor have in place?

If an AI vendor is using internal business data that is confidential or commercially sensitive, it is important to ask what measures the vendor has in place to secure and protect this information.

Similarly, it is important to ensure that the vendor has a protocol in place for dealing with security breaches, including giving the business immediate notice of a security breach.

How accurate is the output?

AI hallucinations (instances where a tool produces plausible but false output) pose a significant risk and undermine the gains a business seeks to achieve by implementing an AI tool.

To mitigate this risk, in-house counsel should confirm that the AI tool has undergone rigorous accuracy testing.

In-house counsel should ask for evidence reflecting the accuracy rate of the AI tool and should ensure the business does its own testing prior to any implementation.

What legislation is applicable?

Prior to implementing an AI tool, in-house counsel should consider in which jurisdiction the AI tool will be used and which legislation is therefore applicable. Similarly, in-house counsel must consider whether the AI tool’s purpose is legal and ethical.

Currently, South Africa does not have legislation that specifically governs the use of AI, but existing legislation may (depending on the particular AI tool) regulate its use and would need to be complied with, such as the Protection of Personal Information Act (PoPIA), the Consumer Protection Act (CPA) and the Competition Act.

Is the business prepared if something goes wrong?

In-house counsel should ensure that there are safeguards and protocols in place to mitigate any ethical, legal, compliance or reputational issues that may arise from the use of the AI tool.

In-house counsel should also stay informed of regulatory changes, particularly in the current environment where AI regulations are in their preliminary stages.

Lastly, be transparent and ethical. If an AI tool is being used in the business, in-house counsel must ensure that all employees understand the AI tool’s capabilities and limitations, as well as the implications of using the tool for an unintended purpose.

If clients or customers need to be informed of AI use, this information must be disclosed as soon as possible.

By addressing these preliminary questions, in-house counsel can help their businesses integrate AI tools in a responsible and ethical manner.

About Melissa Steele

Melissa Steele is a Director at Nortons Inc.