
Crowdsourcing for Speech Processing: Applications to Data Collection, Transcription and Assessment

  • ID: 2330206
  • April 2013
  • 356 Pages
  • John Wiley & Sons Ltd

The concept of crowdsourcing rests on the observation that if a crowd of non-experts is asked for an opinion, the aggregate of their individual opinions will be very close to the true value. Tasks such as collecting speech, labelling it, assessing systems and carrying out studies on the speech data are natural candidates for crowdsourcing. This book is a detailed, hands-on and comprehensive reference for anyone who wants to use crowdsourcing for speech applications. From the reader who has already used crowdsourcing and wants to refine their methods, to the novice who has never used the technique before, this book provides a practical introduction to crowdsourcing as a means of rapidly processing speech data, with contributions from leading researchers in the field.
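The first sentence describes the wisdom-of-the-crowd effect that motivates the book: many noisy, non-expert judgments, once aggregated, tend to land close to the true value. As a minimal sketch of that idea (not taken from the book; the Gaussian noise model and the specific numbers are illustrative assumptions), the following Python snippet averages increasingly large simulated crowds and shows the mean drifting toward the truth:

    # Illustrative sketch of crowd aggregation: each non-expert gives a noisy
    # estimate of a true quantity, and the average of many such estimates
    # approaches the truth. Noise model and numbers are assumptions, not from
    # the book.
    import random

    random.seed(42)

    TRUE_VALUE = 100.0    # the quantity the crowd is asked to estimate
    NOISE_STDDEV = 20.0   # any individual guess may be far off

    def individual_estimate():
        """One non-expert's noisy guess."""
        return random.gauss(TRUE_VALUE, NOISE_STDDEV)

    for crowd_size in (1, 10, 100, 1000):
        estimates = [individual_estimate() for _ in range(crowd_size)]
        crowd_mean = sum(estimates) / len(estimates)
        print(f"crowd of {crowd_size:4d}: mean estimate = {crowd_mean:6.1f}")

Real crowdsourcing tasks need more than simple averaging, of course; the contents below show where the book treats quality control (Sections 2.6 and 4.6).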

- Shows readers how to collect and label speech using crowdsourcing, and how to assess speech applications and run perception studies with it.
- Explains how to choose a crowdsourcing platform.
- Considers the ethical and legal implications of using crowdsourcing for speech processing.
- Includes numerous real-life examples of how to implement crowdsourcing for various types of speech processing.
- Offers several options for each type of task, enabling readers to choose the one that best fits their needs.
- Provides an extensive overview of the literature on crowdsourcing for speech processing.


Contents

List of Contributors xiii

Preface xv

1 An Overview 1

Maxine Eskénazi

1.1 Origins of Crowdsourcing 2

1.2 Operational Definition of Crowdsourcing 3

1.3 Functional Definition of Crowdsourcing 3

1.4 Some Issues 4

1.5 Some Terminology 6

1.6 Acknowledgments 6

References 6

2 The Basics 8

Maxine Eskénazi

2.1 An Overview of the Literature on Crowdsourcing for Speech Processing 8

2.2 Alternative Solutions 14

2.3 Some Ready-Made Platforms for Crowdsourcing 15

2.4 Making Task Creation Easier 17

2.5 Getting Down to Brass Tacks 17

2.6 Quality Control 29

2.7 Judging the Quality of the Literature 32

2.8 Some Quick Tips 33

2.9 Acknowledgments 33

References 33

Further reading 35

3 Collecting Speech from Crowds 37

Ian McGraw

3.1 A Short History of Speech Collection 38

3.2 Technology for Web-Based Audio Collection 43

3.3 Example: WAMI Recorder 49

3.4 Example: The WAMI Server 52

3.5 Example: Speech Collection on Amazon Mechanical Turk 59

3.6 Using the Platform Purely for Payment 65

3.7 Advanced Methods of Crowdsourced Audio Collection 67

3.8 Summary 69

3.9 Acknowledgments 69

References 70

4 Crowdsourcing for Speech Transcription 72

Gabriel Parent

4.1 Introduction 72

4.2 Transcribing Speech 73

4.3 Preparing the Data 80

4.4 Setting Up the Task 83

4.5 Submitting the Open Call 91

4.6 Quality Control 95

4.7 Conclusion 102

4.8 Acknowledgments 103

References 103

5 How to Control and Utilize Crowd-Collected Speech 106

Ian McGraw and Joseph Polifroni

5.1 Read Speech 107

5.2 Multimodal Dialog Interactions 111

5.3 Games for Speech Collection 120

5.4 Quizlet 121

5.5 Voice Race 123

5.6 Voice Scatter 129

5.7 Summary 135

5.8 Acknowledgments 135

References 136

6 Crowdsourcing in Speech Perception 137

Martin Cooke, Jon Barker, and Maria Luisa Garcia Lecumberri

6.1 Introduction 137

6.2 Previous Use of Crowdsourcing in Speech and Hearing 138

6.3 Challenges 140

6.4 Tasks 145

6.5 BigListen: A Case Study in the Use of Crowdsourcing to Identify Words in Noise 149

6.6 Issues for Further Exploration 167

6.7 Conclusions 169

References 169

7 Crowdsourced Assessment of Speech Synthesis 173

Sabine Buchholz, Javier Latorre, and Kayoko Yanagisawa

7.1 Introduction 173

7.2 Human Assessment of TTS 174

7.3 Crowdsourcing for TTS: What Worked and What Did Not 177

7.4 Related Work: Detecting and Preventing Spamming 193

7.5 Our Experiences: Detecting and Preventing Spamming 195

7.6 Conclusions and Discussion 212

References 214

8 Crowdsourcing for Spoken Dialog System Evaluation 217

Zhaojun Yang, Gina-Anne Levow, and Helen Meng

8.1 Introduction 217

8.2 Prior Work on Crowdsourcing: Dialog and Speech Assessment 220

8.3 Prior Work in SDS Evaluation 221

8.4 Experimental Corpus and Automatic Dialog Classification 225

8.5 Collecting User Judgments on Spoken Dialogs with Crowdsourcing 226

8.6 Collected Data and Analysis 230

8.7 Conclusions and Future Work 238

8.8 Acknowledgments 238

References 239

9 Interfaces for Crowdsourcing Platforms 241

Christoph Draxler

9.1 Introduction 241

9.2 Technology 242

9.3 Crowdsourcing Platforms 253

9.4 Interfaces to Crowdsourcing Platforms 261

9.5 Summary 278

References 278

10 Crowdsourcing for Industrial Spoken Dialog Systems 280

David Suendermann and Roberto Pieraccini

10.1 Introduction 280

10.2 Architecture 283

10.3 Transcription 287

10.4 Semantic Annotation 290

10.5 Subjective Evaluation of Spoken Dialog Systems 296

10.6 Conclusion 300

References 300

11 Economic and Ethical Background of Crowdsourcing for Speech 303

Gilles Adda, Joseph J. Mariani, Laurent Besacier, and Hadrien Gelas

11.1 Introduction 303

11.2 The Crowdsourcing Fauna 304

11.3 Economic and Ethical Issues 307

11.4 Under-Resourced Languages: A Case Study 316

11.5 Toward Ethically Produced Language Resources 322

11.6 Conclusion 330

Disclaimer 331

References 331

Index 335


Editors

Maxine Eskénazi
Gina-Anne Levow
Helen Meng
Gabriel Parent
David Suendermann
