For the first dozen general question prompts I tried, GPT-4 gave answers similar to what I had seen from GPT-3.5, which was a bit unexciting. I suppose if I used sketches, or if I were a more advanced prompt engineer, the difference might be bigger.
Give some of the logic puzzles that it failed in the past a try? Or a coding problem?
I’m a true solo attorney leveraging technology to keep overhead low and my prices accessible. I also don’t bill by the hour, so efficiency matters. I’m thrilled at what GPT-4 can do to help me complete work expediently. GPT-3.5 was already getting me to 70% usable material. It can’t replace legal research yet since it hasn’t been trained on it, but given the exponential growth of generative AI, it will get there. I know I’m only scratching the surface in my main line of work.
I wonder whether our intellectual property laws will catch up to this quickly changing AI landscape. Google is rolling out a version of this to work in the background for email and docs. When it’s powering work in the background and we aren’t signing into an AI platform, will that work be copyrightable? Right now the US Copyright Office has said, correctly in my view given existing law, that content created by generative AI is not copyrightable, much like a photograph taken by an animal that got hold of your camera.
As a result, I’ve been cautioning clients using generative AI who want to have IP interests in the output. There’s a lot of gray area right now.
Thanks Matt. I'm curious about your professional opinion on the overall relevance of OpenAI's claim that GPT-4 now passes the bar exam in the top 10% of test-takers?
Great question! I’ve been advocating for the elimination of the bar exam for years. It has little to do with actually practicing law, and its historical roots lie in acting as a barrier to entry into the profession for marginalized groups, much as certain voting laws keep marginalized groups from voting.
Personally, the fact that machine learning can pass the bar exam is very exciting to me, given its potential to give me legal superpowers and help more people at scale.
The general consensus among the most forward-thinking legal professionals actually using AI is that AI is not yet good at practicing law: it cannot properly cite or interpret case law, and it’s prone to hallucinations in a field where getting the facts right is of the utmost importance. I’m just coming off one of the largest legaltech conferences, where this was heavily discussed.
This is really all just a long-winded way of saying that I think this proves the bar exam is bullshit.
Looks like conversion rates to paying ChatGPT subscribers is about to go up: https://twitter.com/jradoff/status/1635966597671034882