VOCABULARY

Pre-trained Transformer

A pre-trained transformer is a neural network model that has already been trained on a large, general-purpose dataset and can then be fine-tuned for specific downstream tasks. Transformers are especially effective at handling sequential data, which makes them well suited to natural language processing (NLP) tasks.
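The operation that lets transformers handle sequential data is attention, in which each token's representation is updated as a weighted mix of all the other tokens in the sequence. The sketch below is a minimal, illustrative scaled dot-product self-attention in NumPy; for simplicity it uses the input embeddings directly as queries, keys, and values, whereas real transformers learn separate projection matrices for each.

```python
import numpy as np

def self_attention(x: np.ndarray):
    """Minimal scaled dot-product self-attention sketch.

    x: (seq_len, d_model) token embeddings.
    Returns contextualized embeddings and the attention weights.
    """
    d = x.shape[-1]
    # Pairwise token similarities, scaled by sqrt(d) for stability.
    scores = x @ x.T / np.sqrt(d)              # (seq_len, seq_len)
    # Softmax over each row: every token gets a distribution
    # over all tokens in the sequence.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of all input tokens.
    return weights @ x, weights

rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))   # 4 tokens, 8-dim embeddings
out, weights = self_attention(tokens)
print(out.shape)        # same shape as the input: (4, 8)
```

Because every token attends to every other token in one step, the model captures long-range dependencies without the sequential bottleneck of recurrent networks.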

The Significance of Pre-trained Transformers

Models such as BERT and GPT have revolutionized NLP by providing a powerful, general framework for language understanding. Once pre-trained, they can be adapted to applications such as text classification, translation, and question answering with relatively little additional training data.
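Why does adaptation need so little data? A common pattern is to freeze the pre-trained model's weights and train only a small task-specific head on top of its features. The sketch below illustrates the idea in NumPy: the "backbone" is a stand-in random projection rather than a real pre-trained transformer, and only a tiny logistic-regression head is trained on a handful of labeled examples.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a pre-trained backbone: weights are FROZEN and
# never updated during fine-tuning.
W_backbone = rng.normal(size=(16, 8))

def features(x):
    """Frozen feature extractor (illustrative, not a real model)."""
    return np.tanh(x @ W_backbone)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny labeled dataset: two well-separated classes, 20 examples each.
X = np.vstack([rng.normal(+1.0, 0.5, (20, 16)),
               rng.normal(-1.0, 0.5, (20, 16))])
y = np.array([1] * 20 + [0] * 20)

# Trainable head: logistic regression on the frozen features.
w, b = np.zeros(8), 0.0
F = features(X)                      # backbone runs once; only w, b learn
for _ in range(500):                 # plain gradient descent
    p = sigmoid(F @ w + b)
    grad = p - y
    w -= 0.1 * F.T @ grad / len(y)
    b -= 0.1 * grad.mean()

acc = ((sigmoid(F @ w + b) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

With a genuinely pre-trained backbone, the frozen features already encode general linguistic knowledge, which is why a small head and a small dataset are often enough.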
