1-Transformer_models-9-End-of-chapter_quiz

Original course link: https://huggingface.co/course/chapter1/10?fw=pt

End-of-chapter quiz

This chapter covered a lot of ground! Don’t worry if you didn’t grasp all the details; the next chapters will help you understand how things work under the hood.

First, though, let’s test what you learned in this chapter!

  1. Explore the Hub and look for the roberta-large-mnli checkpoint. What task does it perform?

Summarization

Text classification

Text generation

  2. What will the following code return?
from transformers import pipeline

ner = pipeline("ner", grouped_entities=True)
ner("My name is Sylvain and I work at Hugging Face in Brooklyn.")

It will return classification scores for this sentence, with labels “positive” or “negative”.

It will return a generated text completing this sentence.

It will return the words representing persons, organizations or locations.
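
For reference, a grouped-entities NER pipeline returns one dictionary per entity span. A sketch of the shape of the output for the sentence above (the scores here are illustrative, and the exact values depend on the pipeline's default checkpoint):

```python
# Illustrative output shape for a grouped-entities NER pipeline
# (scores are made up; real values depend on the default checkpoint)
example_output = [
    {"entity_group": "PER", "word": "Sylvain", "score": 0.998},
    {"entity_group": "ORG", "word": "Hugging Face", "score": 0.979},
    {"entity_group": "LOC", "word": "Brooklyn", "score": 0.993},
]

# Each span groups the sub-word tokens of a person, organization, or location
persons = [e["word"] for e in example_output if e["entity_group"] == "PER"]
```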

  3. What should replace ... in this code sample?
from transformers import pipeline

filler = pipeline("fill-mask", model="bert-base-cased")
result = filler("...")

This has been waiting for you.

This [MASK] has been waiting for you.

This man has been waiting for you.
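
As a quick check, the fill-mask pipeline only accepts inputs that contain the model's mask token, which is [MASK] for bert-base-cased. A minimal sketch (downloads the checkpoint on first run):

```python
from transformers import pipeline

filler = pipeline("fill-mask", model="bert-base-cased")

# bert-base-cased expects its [MASK] token somewhere in the input;
# top_k controls how many candidate fills are returned
results = filler("This [MASK] has been waiting for you.", top_k=2)
for r in results:
    print(r["token_str"], round(r["score"], 3))
```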

  4. Why will this code fail?
from transformers import pipeline

classifier = pipeline("zero-shot-classification")
result = classifier("This is a course about the Transformers library")

This pipeline requires that labels be given to classify this text.

This pipeline requires several sentences, not just one.

The 🤗 Transformers library is broken, as usual.

This pipeline requires longer inputs; this one is too short.
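
The fix is to pass candidate labels, which is what makes the pipeline "zero-shot". A minimal sketch (the label set here is an arbitrary example, and the default checkpoint downloads on first run):

```python
from transformers import pipeline

classifier = pipeline("zero-shot-classification")

# candidate_labels is required: the model scores the text against
# whichever labels you supply, without any task-specific fine-tuning
result = classifier(
    "This is a course about the Transformers library",
    candidate_labels=["education", "politics", "business"],
)
print(result["labels"], result["scores"])
```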

  5. What does “transfer learning” mean?

Transferring the knowledge of a pretrained model to a new model by training it on the same dataset.

Transferring the knowledge of a pretrained model to a new model by initializing the second model with the first model’s weights.

Transferring the knowledge of a pretrained model to a new model by building the second model with the same architecture as the first model.
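
In 🤗 Transformers, this weight transfer is exactly what from_pretrained does. A sketch, assuming a hypothetical two-label fine-tuning setup (the library will warn that the new classification head is randomly initialized, which is expected):

```python
from transformers import AutoModelForSequenceClassification

# The body of the model is initialized from the pretrained bert-base-cased
# weights; only the new 2-label classification head starts from scratch.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased", num_labels=2
)
```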

  6. True or false? A language model usually does not need labels for its pretraining.

True

False
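
The reason is that pretraining objectives like masked language modeling are self-supervised: the labels are derived from the raw text itself. A toy sketch of how one masked-LM training pair is built:

```python
# Toy sketch of building one masked-LM training example: the "label"
# is just the original token, so no human annotation is required.
tokens = "the course will teach you".split()
mask_index = 2

label = tokens[mask_index]            # the original word, recovered from raw text
masked_input = tokens.copy()
masked_input[mask_index] = "[MASK]"   # what the model actually sees
```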

  7. Select the sentence that best describes the terms “model”, “architecture”, and “weights”.

If a model is a building, its architecture is the blueprint and the weights are the people living inside.

An architecture is a map to build a model and its weights are the cities represented on the map.

An architecture is a succession of mathematical functions to build a model, and its weights are those functions’ parameters.
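
In 🤗 Transformers the distinction is concrete: a config object describes the architecture, and instantiating a model from it alone gives random weights. A minimal sketch:

```python
from transformers import BertConfig, BertModel

config = BertConfig()       # the architecture: hidden size, number of layers, ...
model = BertModel(config)   # weights are randomly initialized here

# BertModel.from_pretrained("bert-base-cased") would load trained
# weights into the same architecture instead.
```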

  8. Which of these types of models would you use for completing prompts with generated text?

An encoder model

A decoder model

A sequence-to-sequence model

  9. Which of these types of models would you use for summarizing texts?

An encoder model

A decoder model

A sequence-to-sequence model

  10. Which of these types of models would you use for classifying text inputs according to certain labels?

An encoder model

A decoder model

A sequence-to-sequence model
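
As a rough rule of thumb for the three questions above (these are typical pairings, not hard requirements):

```python
# Typical task-to-architecture pairings (an assumption about common
# practice, not a hard rule; example checkpoints in the comments)
task_to_architecture = {
    "text-generation": "decoder",              # e.g. GPT-2
    "summarization": "sequence-to-sequence",   # e.g. BART, T5
    "text-classification": "encoder",          # e.g. BERT, RoBERTa
}
```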

  11. What is a possible source of the bias observed in a model?

The model is a fine-tuned version of a pretrained model and it picked up its bias from it.

The data the model was trained on is biased.

The metric the model was optimizing for is biased.
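
This chapter probes a model's bias directly with the fill-mask pipeline; a sketch of that kind of experiment (downloads bert-base-uncased on first run, and the exact completions may vary):

```python
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# Compare the top completions for two otherwise identical sentences;
# systematic differences hint at bias picked up from the pretraining data.
male = [r["token_str"] for r in unmasker("This man works as a [MASK].")]
female = [r["token_str"] for r in unmasker("This woman works as a [MASK].")]
print(male)
print(female)
```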