AI storytelling friend

Intermediate | MakeCode | LED, accelerometer, speaker | Understanding AI, Speaking & listening, machine learning, input/output

A great way to use what children already know about narrative and character development to support new learning about AI.

Step-by-step project guide

Step 1: Understand it

How does it work?

In this project you’ll train a machine learning (ML) model to recognise different ways that you move a soft toy with a BBC micro:bit attached to it. You’ll choose movements to help you retell a story.

You will then combine the machine learning model with a Microsoft MakeCode program, and the micro:bit will play sounds or show images when these movements are detected.

What is machine learning?

Machine learning (ML) is a kind of artificial intelligence (AI) in which computers learn from data and use it to make decisions.

In this project, for example, an ML model is trained to help recognise different ‘actions’ when you move the micro:bit in different ways.

How you’ll make it

AI systems need humans to design, build, test, and use them.

First, you will need to decide if you want to use the movements we have provided, or choose different movements that work for your own story. Our story is about a bear called Lucy who wants to be a gymnast, so we have chosen movements that fit this theme: jumping, rolling, and sleeping.

You will then collect data to train the ML model, test it, improve it and combine it with computer code to make a storytelling device that uses AI, using a micro:bit and the micro:bit CreateAI website.

We’ve also included some evaluation questions to compare this AI project with one that just uses normal algorithms and code.

Step 2: Make it

What you need

Collect data samples

When you open the project in micro:bit CreateAI, you’ll see we’ve given you data samples for some suggested movements for your soft toy: jumping, rolling, and sleeping.

You can add your own soft toy movement samples using the micro:bit's movement sensor, its accelerometer.
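
If you’re curious about the raw data behind those samples, the short MakeCode sketch below (shown in the editor’s JavaScript/TypeScript view) prints the accelerometer readings over serial when you press button A. It isn’t part of the CreateAI project, just a hypothetical way to peek at the numbers the model learns from.

```typescript
// Print raw accelerometer readings (in milli-g) each time button A is pressed.
// Open the serial/data view in MakeCode to watch the values change
// as you move the micro:bit in different ways.
input.onButtonPressed(Button.A, function () {
    serial.writeValue("x", input.acceleration(Dimension.X))
    serial.writeValue("y", input.acceleration(Dimension.Y))
    serial.writeValue("z", input.acceleration(Dimension.Z))
})
```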

In micro:bit CreateAI, click the ‘Connect’ button to connect your data collection micro:bit and follow the instructions.

Attach the data collection micro:bit to your soft toy. It’s important that all the samples are recorded with the same placement of the micro:bit on the soft toy. If you want to use the ready-made samples in the project, attach the micro:bit around the neck of the soft toy facing forward, as shown in the picture below. If you want to change how the toy wears the micro:bit, replace all the provided data samples with your own.

A soft toy bear wearing a micro:bit around its neck.

Add your own movement data samples for jumping, rolling and sleeping. Click on each action in turn, then click ‘record’ to take a short sample of each.

If you make a mistake, you can delete any samples you don’t want. You can also press button B on the micro:bit to start recording.

Examine the data samples: do all the ‘jumping’ samples look similar? Do all the ‘rolling’ samples look different to ‘jumping’ and ‘sleeping’?

Train and test the model

Click the ‘Train model’ button to train your model, then test it.

Bounce your soft toy up and down to see if ‘jumping’ is shown as the estimated action. Put the soft toy down to sleep and see if ‘sleeping’ is estimated. Test if ‘rolling’ is detected when you turn the soft toy head over heels.

Ask someone else to move the toy and see if it works as well for them.

Improve the model

Most models can be improved with more data. If you need to improve how well the model recognises your actions, click ‘← Edit data samples’.

You can clean your data set by deleting any samples which you think don’t fit (because they look completely different from other samples for the same action). You can also improve the model by adding more data samples from yourself and from other people.

Think about all the positions your soft toy might ‘sleep’ in; you’ll notice the x, y, and z lines change their order depending on the angle of the micro:bit.
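
If it helps to see why the lines swap, here is a small illustrative sketch (not part of the project) that shows which axis is currently closest to pointing straight down; the 700 milli-g threshold is just an example value.

```typescript
// Whichever axis points roughly straight down reads close to +/-1024 milli-g,
// so the letter shown tells you how the micro:bit (and the toy) is lying.
// 700 is an illustrative threshold, not a value from the CreateAI project.
basic.forever(function () {
    if (Math.abs(input.acceleration(Dimension.Z)) > 700) {
        basic.showString("Z") // lying flat on its front or back
    } else if (Math.abs(input.acceleration(Dimension.Y)) > 700) {
        basic.showString("Y") // standing upright or upside down
    } else if (Math.abs(input.acceleration(Dimension.X)) > 700) {
        basic.showString("X") // lying on its left or right side
    }
    basic.pause(500)
})
```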

Train the model once more and test it again.

Put the model and code on your micro:bit

Click ‘Edit in MakeCode’ in micro:bit CreateAI to see the project code in the MakeCode editor.

You can modify the code just like any other micro:bit MakeCode project, or try it out as it is.

Connect your micro:bit with a USB data cable, click the ‘Download’ button in MakeCode and follow the instructions to transfer the AI model and code blocks onto your micro:bit.

Unplug the micro:bit, attach a battery pack, position it on your soft toy and test it.

How the code blocks work

The 'on ML… start' blocks are triggered when the ML model decides your toy has started one of the actions it has been trained to detect. Different sounds play and different icons are shown on the micro:bit's LED display depending on the action it has estimated your soft toy is doing.

The 'on ML… stop' blocks are triggered when the ML model decides your toy has finished an action. Code inside each block clears the screen and stops all sounds.

An extra block, ‘on ML unknown start’, clears the screen if the model is not sure which action your toy is doing.
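
In the editor’s JavaScript/TypeScript view, the project follows roughly the pattern sketched below. The ml.* names (ml.onStart, ml.event.Jumping and so on) are assumptions about the identifiers the CreateAI extension generates for this project, so check your own project’s code for the exact names.

```typescript
// Rough sketch of the 'on ML… start/stop' pattern. The ml.* identifiers are
// assumptions; the basic.* and music.* calls are standard MakeCode blocks.
ml.onStart(ml.event.Jumping, function () {
    basic.showIcon(IconNames.Happy) // react while the action is detected
    music.play(music.builtinPlayableSoundEffect(soundExpression.spring),
        music.PlaybackMode.InBackground)
})
ml.onStop(ml.event.Jumping, function () {
    basic.clearScreen()   // tidy up when the model decides the action has ended
    music.stopAllSounds()
})
ml.onStart(ml.event.Unknown, function () {
    basic.clearScreen()   // model isn't sure which action the toy is doing
})
```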

Evaluate

Compare this project with the Sensory toy project, which also uses the accelerometer to react to different movements but does not use machine learning or any other kind of AI.

  • What kinds of movements, or actions, can the Sensory toy project react to?
  • What is different about the kinds of actions the AI storytelling friend project can react to? Are they simpler or more complex?
  • What other actions might you want to train the ML model to recognise?
  • Which project is better at helping you tell your story?

Step 3: Extend it

  • Explore different movements with your AI storytelling friend, and change the actions to suit a well-known folk story or fairytale.
  • Use the ‘show LEDs’ block in place of the ‘show icon’ block to customise the icons to match your story (see the sketch after this list). You could plan your customised icons using the LED planning sheets.
  • If you have a class mascot, use CreateAI to train it to respond to movements that give feedback to students, e.g. giving praise or awarding class points.
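
As a rough illustration of the ‘show LEDs’ idea above, a customised icon for the sleeping action might look like this in the JavaScript/TypeScript view (ml.event.Sleeping is again an assumed identifier from the generated CreateAI extension):

```typescript
// Hypothetical sketch: a custom 'Z'-shaped pattern drawn with 'show LEDs'
// instead of a built-in icon. Adapt the 5x5 grid to match your own story.
ml.onStart(ml.event.Sleeping, function () {
    basic.showLeds(`
        # # # . .
        . . # . .
        . # . . .
        # . . . .
        # # # . .
        `)
})
```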