An insurance company declined coverage for its policyholder, a landscaper, in a negligence claim involving the installation of an in-ground trampoline. The main issue on appeal was whether installing a trampoline falls within the policy's definition of "landscaping."

In a concurring opinion, Judge Kevin Newsom wrote that the case led him to contemplate the potential advantages of using generative AI programs such as OpenAI's ChatGPT, Google's Gemini, and Anthropic's Claude to assist courts in interpreting the ordinary meaning of words in documents like insurance contracts.

Newsom said he spent hours researching how "landscaping" is ordinarily understood and that, in a "fit of frustration" after consulting dictionaries, he tasked a clerk with asking ChatGPT just that. The resulting definition was sensible and "less nutty than I had feared," prompting Newsom to then ask ChatGPT whether installing an in-ground trampoline was "landscaping." ChatGPT responded yes.

"Landscaping involves altering the visible features of an outdoor area for aesthetic or practical purposes, and adding an in-ground trampoline would modify the appearance and function of the space," ChatGPT said.

Newsom said he "wholeheartedly" agreed with U.S. Supreme Court Chief Justice John Roberts' assessment, in his December 31 year-end report, that the use of AI in the legal profession requires "caution and humility."

"Importantly, though, I also agree with what I take to be the report's assumption that AI is here to stay," Newsom wrote. "Now, it seems to me, is the time to figure out how to use it profitably and responsibly."

The opinion comes as courts nationwide grapple with the swift emergence of AI programs and consider whether to regulate their use by legal professionals.
RkJQdWJsaXNoZXIy Mjk3Mzkz