#LegalTechPills

AI, Cacio e Pepe

By Helder Galvão

It has become commonplace to claim that artificial intelligence will eliminate certain jobs — a sort of extermination of the labour market. This narrative is often fuelled by commercial interests and market strategies that thrive on such alarmism.

However, as with most things in life, the reality is rarely that extreme — neither entirely bleak nor wholly bright.

Change is inevitable. And with it comes a certain reconfiguration, particularly in the legal sector. Tasks that are highly operational, repetitive, and cognitively undemanding are likely to disappear or lose much of their relevance. Consider, for instance, the drafting of routine audit reports, the search for judicial precedents across various jurisdictions via specialised engines, or even the creation of clauses on specific topics — such as non-compete agreements between two parties — generated by LLMs like ChatGPT, Claude, Gemini or Grok.

But should we really fear the machines? Let’s look at two examples.

The first involves a judge in a city in northeastern Brazil, as reported by Migalhas. This judge used to issue around eighty decisions and rulings per month. Suddenly, that number jumped to over nine hundred. An audit uncovered serious flaws in the rulings and signs of improper use of artificial intelligence.

The judge’s eagerness to deliver justice at an exponential rate raised concerns due to the imbalance it created. After all, when correctly implemented, technology is an ally of justice. But when poorly applied or used without due consideration, it becomes a tool of injustice.

It’s not about volume, standardisation, shortcuts, or meeting targets. It’s about what technology can, in fact, replace — and where it should never interfere.

The second example comes from Cezar Taurion, former IBM executive and advocate of explainable AI. According to Taurion, we often hear interviews with Big Tech CEOs praising the massive time savings achieved through automatic code generation. Some even claim that software development will never be the same again.

But in practice, what is the real impact of these tools on a developer’s daily routine? There’s no doubt that the LLMs mentioned are highly useful in certain contexts — especially for generating simple code, rapid prototypes, testing ideas, or sketching system integrations, including those used in legaltech solutions.

However, as projects increase in complexity, the limitations of these tools become apparent. Taurion notes that as the code gets more complex, LLMs start to “forget” previously requested adjustments, leading to repeated rework. They often add unnecessary features — even when instructed not to — and tend to lose track of context in complex implementations. The time spent correcting these errors can easily negate the supposed time savings.

It’s not much different from conducting legal research. A simple query on limitation or prescription periods, for instance, might yield a basic and objective answer from an LLM — a kind of conceptual summary, with no real depth. For a quick consultation or a superficial check, that has value. But for a more complex legal opinion, with references to doctrine and case law, traditional legal research and human analysis remain indispensable.

The same applies to drafting a commercial lease agreement. For a basic, initial structure, the tool may suffice. But when something more robust and tailored is required — with specific clauses and conditions — the jurist’s involvement becomes essential. And, as in Taurion’s example, revision, refinement, and sometimes discarding parts of the AI-generated text will be necessary. Without constant human oversight, entire projects may be compromised, with precious time lost in the process.

Of course, LLMs are undeniably valuable for speeding up repetitive tasks, creating prototypes and quickly consulting concepts or syntax. But they do not replace the critical reasoning of a developer or legal professional. Anyone who risks sending AI-generated code or contracts straight to production or to a client without proper review or minimum technical rigour is likely not involved in complex projects — or simply does not care about the quality of their work or their professional reputation. Just ask the Brazilian judge…

At the end of the day, it’s just like the Italians treat their cuisine. One might opt for a well-known brand of spaghetti to save time when preparing the classic cacio e pepe. But the touch of the chef is essential — in selecting the finest ingredients, blending flavours and spices, and timing the cooking perfectly, so as not to overdo it and disappoint the diner. Just like with LLMs: give me the basics, and I’ll give you something unique. Prego!
