
"Artificial Intelligence" Ethics for Attorneys

What can AI ethically do for you?

  • Product Number: 2250075P01
  • CLE Credits, earn up to:
    1.5 substantive credits, 1.5 ethics credits


Also Available:
On Demand Webcast
Includes downloadable supporting materials. $145.00; Members $130.50; New Lawyers $72.50; free for OnlinePass subscribers.
  • Product Description

    We have enjoyed law practice automation for more than half a century, from early keyword searches of cases manually entered into databases, to semi-automatic assembly of common contract clauses, to expert-seeded predictive coding for screening documents for production in discovery. Since November 2022, the availability of ChatGPT, built on the GPT-3.5 large language model (LLM), and of other LLMs that generate human-sounding text by applying patterns found in billions of words of training text has attracted tens of millions of users. These users include attorneys, several of whom have been called out by courts for filing papers with machine-generated, “fake” case citations or arguments.

    Some courts and bar organizations have set down rules to address these and other inappropriate uses of generative artificial intelligence. More considered rules are being developed as risks are identified. GenAI-proposed answers that pass human review as “good enough” may turn out to be wrong, reflecting biases and other inadequacies in LLM training data that are opaque to the user.

    Legal ethics issues in law practice automation, such as confidentiality, have been amplified by GenAI through embedding and much-enhanced retrievability, which were not considered in the “proportionality” rules established just a few years ago. Foundation LLMs, often customized or “fine-tuned,” are offered through web services on diverse terms and are trained on information “scraped” from public-facing sources to which creators and individuals may raise proprietary or privacy claims; this leaves tool providers, attorneys, and clients with limited ability to identify, much less resolve, those issues.

    In this program, attorneys learn their ethical responsibilities related to generative AI from practitioners who have been involved in policy development, including at the Board of Bar Overseers of the Massachusetts Supreme Judicial Court.

  • Agenda & Materials

    Please Note

    MCLE webcasts are delivered completely online, underscoring their convenience and appeal. There are no published print materials; all written materials are available electronically only. They are posted 24 hours prior to the program and can be accessed, downloaded, or printed from your computer.

  • Faculty

    Chair

    Stephen Y. Chow, Esq., Stephen Y. Chow, PC, Boston

    Faculty

    Warren E. Agin, Esq., Analytic Law LLC, Boston
    Paula M. Bagger, Esq., Law Office of Paula M Bagger LLC, Boston