title: 20240227-huggingface-nlp
date: 2024-02-27
tags:
  • course
  • nlp
updated: 2024-02-27
up:
  • "[[nlp]]"

Transformer models

NLP

Challenge

  • Humans can quickly tell that two words are similar in meaning; getting a machine to do the same is hard.

Transformers, what can they do?

Working with pipelines

  • https://huggingface.co/models
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
classifier("I've been waiting for a HuggingFace course my whole life.")
# [{'label': 'POSITIVE', 'score': 0.9598047137260437}]

See the Transformers documentation for the full list of available pipelines.

Zero-shot classification

No fine-tuning on your own labeled data is needed: the pretrained model scores the text directly against the candidate labels you provide.

from transformers import pipeline

classifier = pipeline("zero-shot-classification")
classifier(
    "This is a course about the Transformers library",
    candidate_labels=["education", "politics", "business"],
)
#{'sequence': 'This is a course about the Transformers library',
# 'labels': ['education', 'business', 'politics'],
# 'scores': [0.8445963859558105, 0.111976258456707, 0.043427448719739914]}

Text generation

from transformers import pipeline

generator = pipeline("text-generation")
generator("In this course, we will teach you how to")
# [{'generated_text': 'In this course, we will teach you how to understand and use data flow and data interchange when handling user data. We will be working with one or more of the most commonly used data flows — data flows of various types, as seen by the HTTP'}]

Using any model from the Hub in a pipeline

from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
generator(
    "In this course, we will teach you how to",
    max_length=30,
    num_return_sequences=2,
)
# [{'generated_text': 'In this course, we will teach you how to manipulate the world and move your mental and physical capabilities to your advantage.'}, {'generated_text': 'In this course, we will teach you how to become an expert and practice realtime, and with a hands on experience on both real time and real'}]

Mask filling

from transformers import pipeline

unmasker = pipeline("fill-mask")
unmasker("This course will teach you all about <mask> models.", top_k=2)
# [{'sequence': 'This course will teach you all about mathematical models.', 'score': 0.19619831442832947, 'token': 30412, 'token_str': ' mathematical'}, {'sequence': 'This course will teach you all about computational models.', 'score': 0.04052725434303284, 'token': 38163, 'token_str': ' computational'}]

Named entity recognition

from transformers import pipeline

ner = pipeline("ner", grouped_entities=True)
ner("My name is Sylvain and I work at Hugging Face in Brooklyn.")
# [{'entity_group': 'PER', 'score': 0.99816, 'word': 'Sylvain', 'start': 11, 'end': 18}, {'entity_group': 'ORG', 'score': 0.97960, 'word': 'Hugging Face', 'start': 33, 'end': 45}, {'entity_group': 'LOC', 'score': 0.99321, 'word': 'Brooklyn', 'start': 49, 'end': 57}]

Question answering

from transformers import pipeline

question_answerer = pipeline("question-answering")
question_answerer(
    question="Where do I work?",
    context="My name is Sylvain and I work at Hugging Face in Brooklyn",
)
# {'score': 0.6385916471481323, 'start': 33, 'end': 45, 'answer': 'Hugging Face'}

Summarization

from transformers import pipeline

summarizer = pipeline("summarization")
summarizer(
    """
    America has changed dramatically during recent years. Not only has the number of 
    graduates in traditional engineering disciplines such as mechanical, civil, 
    electrical, chemical, and aeronautical engineering declined, but in most of 
    the premier American universities engineering curricula now concentrate on 
    and encourage largely the study of engineering science. As a result, there 
    are declining offerings in engineering subjects dealing with infrastructure, 
    the environment, and related issues, and greater concentration on high 
    technology subjects, largely supporting increasingly complex scientific 
    developments. While the latter is important, it should not be at the expense 
    of more traditional engineering.

    Rapidly developing economies such as China and India, as well as other 
    industrial countries in Europe and Asia, continue to encourage and advance 
    the teaching of engineering. Both China and India, respectively, graduate 
    six and eight times as many traditional engineers as does the United States. 
    Other industrial countries at minimum maintain their output, while America 
    suffers an increasingly serious decline in the number of engineering graduates 
    and a lack of well-educated engineers.
"""
)
# [{'summary_text': ' America has changed dramatically during recent years . The number of engineering graduates in the U.S. has declined in traditional engineering disciplines such as mechanical, civil , electrical, chemical, and aeronautical engineering . Rapidly developing economies such as China and India, as well as other industrial countries in Europe and Asia, continue to encourage and advance engineering .'}]

Translation

from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-fr-en")
translator("Ce cours est produit par Hugging Face.")
# [{'translation_text': 'This course is produced by Hugging Face.'}]

How do Transformers work?

A bit of Transformer history

The Transformer architecture was introduced in June 2017. (Figure: a brief chronology of Transformer models.)

Transformers are language models

  • Causal language modeling: predict the next word in a sentence from the words that precede it (see the sketch after this list).
  • Masked language modeling: predict a masked word in a sentence from the context on both sides.
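
A minimal sketch of both objectives; gpt2 and bert-base-uncased are illustrative checkpoints chosen here, not mandated by these notes:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, AutoModelForMaskedLM

# Causal LM: predict the next token from the tokens before it
clm_tok = AutoTokenizer.from_pretrained("gpt2")
clm = AutoModelForCausalLM.from_pretrained("gpt2")
inputs = clm_tok("In this course, we will teach you how", return_tensors="pt")
next_id = clm(**inputs).logits[0, -1].argmax()
print(clm_tok.decode(next_id))

# Masked LM: predict a masked token using context from both sides
mlm_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
mlm = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
inputs = mlm_tok("This course will teach you all about [MASK] models.", return_tensors="pt")
mask_pos = (inputs["input_ids"] == mlm_tok.mask_token_id).nonzero()[0, 1]
print(mlm_tok.decode(mlm(**inputs).logits[0, mask_pos].argmax()))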

Transformers are big models

(Figure: number of parameters of recent Transformer models.)
(Figure: the carbon footprint of a large language model.)

Transfer Learning

  • Pretraining vs. fine-tuning: pretraining a language model is costly in both time and money, while fine-tuning a pretrained model is far cheaper on both counts (see the sketch below).
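
A minimal fine-tuning sketch: reuse a pretrained body and train a fresh task head on labeled data. The checkpoint and the two-example batch are illustrative assumptions:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Pretrained body plus a freshly initialized 2-label classification head
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# One illustrative gradient step on a toy labeled batch
batch = tok(["I love this!", "This is awful."], padding=True, return_tensors="pt")
batch["labels"] = torch.tensor([1, 0])
loss = model(**batch).loss
loss.backward()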

General architecture

Introduction

(Figure: architecture of a Transformer model.)

Attention layers

The original architecture

(Figure: architecture of the original Transformer model.)

Architectures vs. checkpoints

  • Architecture: the skeleton of the model, i.e. the definition of its layers and operations
  • Checkpoint: the weights to be loaded into a given architecture (see the sketch after this list)
  • Model: This is an umbrella term that isn’t as precise as “architecture” or “checkpoint”: it can mean both. This course will specify architecture or checkpoint when it matters to reduce ambiguity.
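
A short sketch of the distinction, using BERT as in the course:

from transformers import BertConfig, BertModel

# Architecture only: the skeleton, with randomly initialized weights
config = BertConfig()
random_model = BertModel(config)

# Checkpoint: the same skeleton loaded with pretrained weights
pretrained_model = BertModel.from_pretrained("bert-base-cased")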

Encoder models

Decoder models

Sequence-to-sequence models

Bias and limitations

  • Models can be biased because they are pretrained on data scraped from the internet.

Summary

| Model | Examples | Tasks |
| --- | --- | --- |
| Encoder | ALBERT, BERT, DistilBERT, ELECTRA, RoBERTa | Sentence classification, named entity recognition, extractive question answering |
| Decoder | CTRL, GPT, GPT-2, Transformer XL | Text generation |
| Encoder-decoder | BART, T5, Marian, mBART | Summarization, translation, generative question answering |

Using Transformers

Inside the pipeline

The full NLP pipeline: tokenization of text, conversion to IDs, and inference through the Transformer model and the model head.
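
The three steps in code, using the default sentiment-analysis checkpoint as in the course:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# 1. Tokenization: raw text -> input IDs and attention mask
inputs = tokenizer("I've been waiting for a HuggingFace course my whole life.", return_tensors="pt")

# 2. Model: input IDs -> raw logits from the classification head
logits = model(**inputs).logits

# 3. Post-processing: logits -> probabilities -> human-readable labels
probs = torch.nn.functional.softmax(logits, dim=-1)
print(model.config.id2label, probs)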

Models

A model can be built from a config and saved to disk with save_pretrained:

from transformers import BertConfig, BertModel
import torch

# Build a model from a default config (randomly initialized weights) and save it
model = BertModel(BertConfig())
model.save_pretrained("directory_on_my_computer")

# The model consumes batches of token IDs as tensors
encoded_sequences = [[101, 7592, 999, 102], [101, 4658, 1012, 102]]
model_inputs = torch.tensor(encoded_sequences)
output = model(model_inputs)

Tokenizer

Word-based

  • Words missing from the tokenizer's vocabulary get mapped to an unknown token ([UNK]), losing information.

Character-based

Subword tokenization

  • BERT uses this (WordPiece); see the encoding sketch below.

Encoding
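
Encoding happens in two steps: tokenization, then conversion to input IDs. A minimal sketch with bert-base-cased, following the course example:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

# Step 1: split the text into (subword) tokens
tokens = tokenizer.tokenize("Using a Transformer network is simple")
print(tokens)  # ['Using', 'a', 'transform', '##er', 'network', 'is', 'simple']

# Step 2: map each token to its vocabulary ID
ids = tokenizer.convert_tokens_to_ids(tokens)

# Decoding reverses the process: IDs back to a string
print(tokenizer.decode(ids))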

Handling multiple sequences
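
Batching sequences of different lengths requires padding, plus an attention mask so the model ignores the padded positions. The toy token IDs below follow the course:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# Pad the shorter sequence, then mask the padded position
batched_ids = [
    [200, 200, 200],
    [200, 200, tokenizer.pad_token_id],
]
attention_mask = [
    [1, 1, 1],
    [1, 1, 0],  # 0 = padding, ignored by attention
]
outputs = model(torch.tensor(batched_ids), attention_mask=torch.tensor(attention_mask))
print(outputs.logits)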

Putting it all together
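
In practice the tokenizer does all of this in one call: tokenize, pad, truncate, and return tensors ready for the model:

from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

sequences = ["I've been waiting for a HuggingFace course my whole life.", "So have I!"]
tokens = tokenizer(sequences, padding=True, truncation=True, return_tensors="pt")
output = model(**tokens)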

Fine-tuning a pre-trained model

(skip)

Sharing

(skip)

Datasets library

(skip)

Tokenizers library

(skip)

Main NLP tasks

Token Classification

  • Named entity recognition (NER): find the entities (persons, locations, organizations) in a sentence
  • Part-of-speech tagging (POS): tag each word with its part of speech (noun, verb, adjective, ...)
  • Chunking: group the tokens that belong to the same phrase

Fine-tuning a masked language model

Translation

Summarization

Causal Language Modeling

  • Predict the next word from the words before it

Question answering

Building and sharing using Gradio
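
A minimal Gradio sketch, wiring the sentiment-analysis pipeline from earlier into a web demo; the predict helper is an illustrative name:

import gradio as gr
from transformers import pipeline

classifier = pipeline("sentiment-analysis")

def predict(text):
    # Illustrative helper: run the pipeline and format its top result
    result = classifier(text)[0]
    return f"{result['label']} ({result['score']:.3f})"

demo = gr.Interface(fn=predict, inputs="text", outputs="text")
demo.launch()  # share=True would create a shareable public link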

Ref

  • https://huggingface.co/learn/nlp-course/chapter0/1