
MACHINE LEARNING

Through Practise

G. V. V. Sharma
Copyright ©2023 by G. V. V. Sharma.

https://creativecommons.org/licenses/by-sa/3.0/
and
https://www.gnu.org/licenses/fdl-1.3.en.html
Contents

Introduction

1 Least Squares

2 PT-100
  2.1 Training Data
  2.2 Model
  2.3 Solution
  2.4 Validation

3 K-Means Method
  3.1 Introduction
  3.2 Fitting a Gaussian Curve
  3.3 K-Means Clustering
  3.4 Results

4 Beacon Tracking
  4.1 Components
  4.2 Procedure
  4.3 Working
    4.3.1 Underlying Principles
    4.3.2 Algorithm Description
  4.4 Observations

5 TinyML
  5.1 Gesture Recognition
    5.1.1 Component
    5.1.2 Training Data
    5.1.3 Model
    5.1.4 Implementation
  5.2 Gesture Controlled Seven Segment
    5.2.1 Components
    5.2.2 Setup
    5.2.3 Connections
    5.2.4 Execution
  5.3 Gesture Controlled UGV
    5.3.1 Components
    5.3.2 Training Data
    5.3.3 Model
    5.3.4 Implementation

A Three Dimensions
Introduction

This book introduces machine learning through simple examples.

Chapter 1

Least Squares

1.0.1 Find the shortest distance between the lines




r = (î + 2ĵ + k̂) + λ(î − ĵ + k̂) and


r = 2î − ĵ − k̂ + µ(2î + ĵ + 2k̂)

1.0.2 Find the shortest distance between the lines

(x + 1)/7 = (y + 1)/(−6) = (z + 1)/1  and  (x − 3)/1 = (y − 5)/(−2) = (z − 7)/1

Solution: The given lines can be written as

x = (−1, −1, −1)⊤ + λ1 (7, −6, 1)⊤   (1.1)

x = (3, 5, 7)⊤ + λ2 (1, −2, 1)⊤   (1.2)

x1 = (−1, −1, −1)⊤ ,  x2 = (3, 5, 7)⊤ ,  m1 = (7, −6, 1)⊤ ,  m2 = (1, −2, 1)⊤   (1.3)
We first check whether the given lines are skew. The lines

x = x1 + λ1 m1 , x = x2 + λ2 m2 (1.4)

intersect if

Mλ = x2 − x1   (1.5)

where

M ≜ (m1  m2)   (1.6)

λ ≜ (λ1, −λ2)⊤   (1.7)

Here we have

M = (m1  m2) = [7 1; −6 −2; 1 1] ,   x2 − x1 = (4, 6, 8)⊤   (1.9)

We check whether the equation (1.10) has a solution:

[7 1; −6 −2; 1 1] λ = (4, 6, 8)⊤   (1.10)

The augmented matrix is given by

[7 1 | 4; −6 −2 | 6; 1 1 | 8]   →(R2 ← R2 + (6/7)R1, R3 ← R3 − (1/7)R1)   (1.11)

[7 1 | 4; 0 −8/7 | 66/7; 0 6/7 | 52/7]   →(R3 ← R3 + (3/4)R2)   (1.12)

[7 1 | 4; 0 −8/7 | 66/7; 0 0 | 29/2]   (1.13)
The rank of the matrix is 3. So the given lines are skew. The closest points on two

skew lines defined by (1.4) are given by

M⊤Mλ = M⊤(x2 − x1)   (1.14)

=⇒ [7 −6 1; 1 −2 1] [7 1; −6 −2; 1 1] λ = [7 −6 1; 1 −2 1] (4, 6, 8)⊤   (1.15)

=⇒ [86 20; 20 6] λ = (0, 0)⊤   (1.16)

The augmented matrix of the above equation (1.16) is given by

[86 20 | 0; 20 6 | 0]   →(R2 ← R2 − (10/43)R1)   [86 20 | 0; 0 58/43 | 0]   (1.17)

  →(R1 ← (1/86)(R1 − (430/29)R2), R2 ← (43/58)R2)   [1 0 | 0; 0 1 | 0]   (1.18)

yielding

λ = (λ1, −λ2)⊤ = (0, 0)⊤   (1.19)

The closest points A on line l1 and B on line l2 are given by

A = x1 + λ1 m1 = (−1, −1, −1)⊤   (1.20)

B = x2 + λ2 m2 = (3, 5, 7)⊤   (1.21)

The minimum distance between the lines is given by

∥B − A∥ = ∥(4, 6, 8)⊤∥ = 2√29   (1.22)

Figure 1.1:
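The computation above is easy to verify numerically. The following Python sketch is not part of the book's code repository; it only assumes numpy. It checks whether the lines of Problem 1.0.2 are skew and solves the normal equations (1.14) for the closest points.

import numpy as np

# Lines from Problem 1.0.2: x = x1 + lam1*m1 and x = x2 + lam2*m2
x1 = np.array([-1.0, -1.0, -1.0]); m1 = np.array([7.0, -6.0, 1.0])
x2 = np.array([3.0, 5.0, 7.0]);    m2 = np.array([1.0, -2.0, 1.0])

M = np.column_stack((m1, m2))                 # M = (m1  m2)
b = x2 - x1

# M lam = b has no solution iff rank([M | b]) > rank(M); with non-parallel
# direction vectors this means the lines are skew.
skew = np.linalg.matrix_rank(np.column_stack((M, b))) > np.linalg.matrix_rank(M)

# Closest points from the normal equations M^T M lam = M^T b, with lam = (lam1, -lam2)
lam = np.linalg.solve(M.T @ M, M.T @ b)
A = x1 + lam[0] * m1
B = x2 - lam[1] * m2
print(skew, A, B, np.linalg.norm(B - A))      # True, (-1,-1,-1), (3,5,7), 2*sqrt(29)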

1.0.3 Find the shortest distance between the lines whose vector equations are

x = (1, 2, 3)⊤ + λ1 (1, −3, 2)⊤   (1.23)

and

x = (4, 5, 6)⊤ + λ2 (2, 3, 1)⊤   (1.24)

Solution: In this case,

x1 = (1, 2, 3)⊤ ,  x2 = (4, 5, 6)⊤ ,  m1 = (1, −3, 2)⊤ ,  m2 = (2, 3, 1)⊤   (1.25)

To check whether (A.3) has a solution in λ, we use the augmented matrix:

[1 2 | 3; −3 3 | 3; 2 1 | 3]   →(R2 ← R2 + 3R1)   [1 2 | 3; 0 9 | 12; 2 1 | 3]   (1.26)

  →(R3 ← R3 − 2R1)   [1 2 | 3; 0 9 | 12; 0 −3 | −3]   (1.27)

  →(R3 ← 3R3 + R2)   [1 2 | 3; 0 9 | 12; 0 0 | 3]   (1.28)

Clearly, the rank of this matrix is 3, and therefore, the lines are skew. Substituting

from (1.25) in (A.6) and forming the augmented matrix,

[14 −5 | 0; −5 14 | 18]   →(R1 ← R1 + R2)   [9 9 | 18; −5 14 | 18]   (1.29)

  →(R1 ← (1/9)R1)   [1 1 | 2; −5 14 | 18]   (1.30)

  →(R2 ← R2 + 5R1)   [1 1 | 2; 0 19 | 28]   (1.31)

  →(R1 ← 19R1 − R2)   [19 0 | 10; 0 19 | 28]   (1.32)

  →(R1 ← (1/19)R1, R2 ← (1/19)R2)   [1 0 | 10/19; 0 1 | 28/19]   (1.33)

=⇒ λ = (1/19) (10, 28)⊤   (1.34)
 

Hence, using (A.5) and substituting into (A.7) and (A.8),

A = (1/19) (29, 8, 77)⊤ ,   B = (1/19) (20, 11, 86)⊤   (1.35)

Thus, the required distance is

∥B − A∥ = √(9² + 3² + (−9)²) / 19 = 3/√19   (1.36)

The situation is depicted in Fig. 1.2.

Figure 1.2: AB is the required shortest distance.

1.0.4

1.0.5 Find the shortest distance between the lines l1 and l2 whose vector equations are
r = î + ĵ + λ(2î − ĵ + k̂) and r = 2î + ĵ − k̂ + µ(3î − 5ĵ + 2k̂).

Solution: The given lines can be written in vector form as

x = (1, 1, 0)⊤ + λ1 (2, −1, 1)⊤ ,   x = (2, 1, −1)⊤ + λ2 (3, −5, 2)⊤   (1.37)

=⇒ x1 = (1, 1, 0)⊤ ,  x2 = (2, 1, −1)⊤ ,  m1 = (2, −1, 1)⊤ ,  m2 = (3, −5, 2)⊤   (1.38)

We first check whether the given lines are skew. The lines

x = x1 + λ1 m1 , x = x2 + λ2 m2 (1.39)

intersect if

Mλ = x2 − x1   (1.40)

where

M ≜ (m1  m2)   (1.41)

λ ≜ (λ1, −λ2)⊤   (1.42)

Here we have

M = (m1  m2) = [2 3; −1 −5; 1 2] ,   x2 − x1 = (1, 0, −1)⊤   (1.44)

We check whether the equation (1.45) has a solution:

[2 3; −1 −5; 1 2] λ = (1, 0, −1)⊤   (1.45)

The augmented matrix is given by

[2 3 | 1; −1 −5 | 0; 1 2 | −1]   →(R2 ← R2 + (1/2)R1, R3 ← R3 − (1/2)R1)   [2 3 | 1; 0 −7/2 | 1/2; 0 1/2 | −3/2]   (1.46)

  →(R3 ← 7R3 + R2)   [2 3 | 1; 0 −7/2 | 1/2; 0 0 | −10]   (1.47)

The rank of the matrix is 3. So the given lines are skew. The closest points on two

skew lines defined by (1.39) are given by

M⊤Mλ = M⊤(x2 − x1)   (1.48)

=⇒ [2 −1 1; 3 −5 2] [2 3; −1 −5; 1 2] λ = [2 −1 1; 3 −5 2] (1, 0, −1)⊤   (1.49)

=⇒ [6 13; 13 38] λ = (1, 1)⊤   (1.50)

The augmented matrix of the above equation (1.50) is given by

[6 13 | 1; 13 38 | 1]   →(R2 ← R2 − (13/6)R1)   [6 13 | 1; 0 59/6 | −7/6]   (1.51)

  →(R1 ← R1 − (78/59)R2)   [6 0 | 150/59; 0 59/6 | −7/6]   (1.52)

So, we get

λ = (λ1, −λ2)⊤ = (25/59, −7/59)⊤   (1.53)

The closest points A on line l1 and B on line l2 are given by

A = x1 + λ1 m1 = (1/59) (109, 34, 25)⊤   (1.54)

B = x2 + λ2 m2 = (1/59) (139, 24, −45)⊤   (1.55)

The minimum distance between the lines is given by

∥B − A∥ = (1/59) ∥(30, −10, −70)⊤∥ = 10/√59   (1.56)

See Fig. 1.3.

Figure 1.3:

Chapter 2

PT-100

This chapter illustrates the use of the least squares method in finding the voltage across

the PT-100 RTD (Resistance Temperature Detector) as a function of temperature.

2.1. Training Data


The training data gathered from the PT-100 using the Arduino is shown in Table 2.1.

Temperature (◦ C) Voltage (V)


66 1.85
27 1.76
2 1.66
23 1.72
56 1.82
34 1.76
33 1.75
31 1.74

Table 2.1: Training data.

The C++ source file

pt100/codes/data.cpp

was used along with platformio to drive the Arduino. The effective schematic circuit diagram

is shown in Figure 2.1.

Figure 2.1: Schematic Circuit Diagram to Measure the Output of PT-100 (P), showing a 5 V supply and a 10 Ω resistor.

2.2. Model
For the PT-100, we use the Callendar-Van Dusen equation

V(T) = V(0) (1 + AT + BT²)   (2.1)

=⇒ c = n⊤x   (2.2)

where

c = V(T) ,   n = V(0) (1, A, B)⊤ ,   x = (1, T, T²)⊤   (2.3)

For multiple points, (2.2) becomes

X⊤ n = C (2.4)

where

X = [1 1 ... 1; T1 T2 ... Tn; T1² T2² ... Tn²]   (2.5)

C = (V(T1), V(T2), ..., V(Tn))⊤   (2.6)

and n is the unknown.

2.3. Solution
We approximate n by using the least squares method. The Python code codes/lsq.py solves for n.

The calculated value of n is

n = (1.6547, 3.199 × 10⁻³, −3.9599 × 10⁻⁶)⊤   (2.7)

The approximation is shown in Fig. 2.2.

Thus, the approximate model is given by

V(T) = 1.6547 + 3.199 × 10⁻³ T − 3.9599 × 10⁻⁶ T²   (2.8)
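This least squares fit is straightforward to reproduce with numpy. The sketch below is not the book's lsq.py; it only assumes numpy and the values in Table 2.1, and solves X⊤n = C from (2.4) in the least squares sense.

import numpy as np

# Training data from Table 2.1 (temperature in deg C, voltage in V)
T = np.array([66, 27, 2, 23, 56, 34, 33, 31], dtype=float)
V = np.array([1.85, 1.76, 1.66, 1.72, 1.82, 1.76, 1.75, 1.74])

# Rows of X^T are (1, T, T^2), so that X^T n = C as in (2.4)
Xt = np.column_stack((np.ones_like(T), T, T**2))

# Least squares estimate of n = V(0)*(1, A, B)^T
n, *_ = np.linalg.lstsq(Xt, V, rcond=None)
print(n)                                   # should roughly match (2.7)

# Predicted voltage at a new temperature, e.g. 25 deg C
print(n[0] + n[1] * 25 + n[2] * 25**2)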

Figure 2.2: Training the model.

Notice in (2.8) that the coefficient of T² is negative, so the governing function is strictly concave. Hence, we cannot use gradient descent methods to solve this problem.

2.4. Validation
The validation dataset is shown in Table 2.2. The results of the validation are shown in Fig.

2.3.

Temperature (◦ C) Voltage (V)
4 1.67
25 1.73
61 1.83
35 1.77

Table 2.2: Validation data.

Figure 2.3: Validating the model.
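As a quick numerical check (a sketch, not the book's validation script), the fitted model (2.8) can be evaluated at the validation temperatures of Table 2.2 and compared against the measured voltages.

# Evaluate the fitted model (2.8) on the validation data in Table 2.2
T_val = [4, 25, 61, 35]
V_val = [1.67, 1.73, 1.83, 1.77]
for t, v in zip(T_val, V_val):
    v_hat = 1.6547 + 3.199e-3 * t - 3.9599e-6 * t**2
    print(t, v, round(v_hat, 3))   # predicted values stay close to the measurements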

Chapter 3

K-Means Method

3.1. Introduction
We test the utility of the K-means algorithm in assigning grades as compared to estimating

the grades using the standard normal distribution. We consider the scores of N = 94

students who have taken a course in the Indian Institute of Technology, Hyderabad (IITH)

as our dataset.

3.2. Fitting a Gaussian Curve


Since N is not very large, given the scores of each student xi , 1 ≤ i ≤ N , we can compute

the population mean and population variance as

µ = E[x]   (3.1)

σ² = E[(x − µ)²]   (3.2)

We assume that the scores x ∼ N(µ, σ²). Thus, we compute the Z-scores as

Z = (x − µ)/σ   (3.3)

The grades are assigned as per Table 3.1.

Interval    Grade
(−∞, −3]    F
(−3, −2]    D
(−2, −1]    C
(−1, 0]     B-
(0, 1]      B
(1, 2]      A-
(2, 3]      A
(3, ∞)      A+

Table 3.1: Grading Scheme.

The Python code

grading/codes/grades_norm.py

takes the given input population dataset

grading/codes/marks.xlsx

and assigns grades appropriately. The grades are output to

grading/codes/grades_norm.xlsx
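A minimal sketch of this Gaussian grading scheme is given below. It is not the book's grades_norm.py; the helper name and the toy marks are made up, and only numpy is assumed.

import numpy as np

def assign_grades(scores):
    """Assign grades from Z-scores using the boundaries of Table 3.1."""
    mu, sigma = np.mean(scores), np.std(scores)   # population mean and std, (3.1)-(3.2)
    z = (scores - mu) / sigma                     # Z-scores, (3.3)
    bounds = [-3, -2, -1, 0, 1, 2, 3]             # right-closed interval boundaries
    grades = np.array(["F", "D", "C", "B-", "B", "A-", "A", "A+"])
    return grades[np.searchsorted(bounds, z, side="left")]

scores = np.array([35.0, 48.0, 52.0, 61.0, 74.0, 90.0])   # toy marks, not the IITH dataset
print(assign_grades(scores))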

3.3. K-Means Clustering
K-Means clustering is an unsupervised classification model, which attempts to cluster unlabeled data in order to gain more structure from it.

We frame this requirement as an optimization problem. For a set of data points {xn}, 1 ≤ n ≤ N, and means {µk}, 1 ≤ k ≤ K, we define

rnk ≜ 1 if arg minj ∥xn − µj∥ = k, and rnk ≜ 0 otherwise.   (3.4)

Thus, we need to find points µk minimizing the cost function

J ≜ Σ_{n=1}^{N} Σ_{k=1}^{K} rnk ∥xn − µk∥²   (3.5)

Clearly, (3.5) is a quadratic function of µk. Differentiating with respect to µk and setting the derivative to zero, we get

−2 Σ_{n=1}^{N} rnk (xn − µk) = 0   (3.6)

=⇒ µk = (Σ_{n=1}^{N} rnk xn) / (Σ_{n=1}^{N} rnk) = X rk / (1⊤ rk)   (3.7)

where

X ≜ (x1 x2 ... xN)   (3.8)

rk ≜ (r1k r2k ... rNk)⊤   (3.9)

1 ≜ (1 1 ... 1)⊤   (3.10)

From (3.7), we see that the optimum is attained when µk is set to the expectation of the

xn with respect to rnk .

Thus, the K-means algorithm is essentially an EM algorithm, where each iteration consists of two steps.

1. E Step: Calculate the K expected values

µ̃k ≜ (Σ_{n=1}^{N} rnk xn) / (Σ_{n=1}^{N} rnk)   (3.11)

for 1 ≤ k ≤ K.

2. M Step: Assign µk ← µ̃k for 1 ≤ k ≤ K.
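The iteration in (3.4)-(3.11) translates directly into code. The following sketch is a toy one-dimensional implementation on made-up marks, not the book's grading code; it assumes every cluster keeps at least one assigned point.

import numpy as np

def kmeans(X, K, iters=100, seed=0):
    """Plain K-means on 1-D scores X, following (3.4)-(3.11)."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(X, size=K, replace=False)            # initial means
    for _ in range(iters):
        # Assignment: r[n, k] = 1 iff mean k is the closest to X[n], as in (3.4)
        d = np.abs(X[:, None] - mu[None, :])
        r = np.zeros_like(d)
        r[np.arange(len(X)), d.argmin(axis=1)] = 1
        # Update: move each mean to the average of its assigned points, as in (3.7)
        mu = (r.T @ X) / r.sum(axis=0)
    labels = np.abs(X[:, None] - mu[None, :]).argmin(axis=1)
    return mu, labels

marks = np.array([12., 35., 38., 41., 55., 58., 62., 80., 83., 91.])  # toy data
print(kmeans(marks, K=4))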

3.4. Results
The grade distribution using each method is shown in Fig. 3.1 and Fig. 3.2. Based on the

results, we can make the following observations:

1. Grading using the Gaussian distribution would lead to many students failing the

course, while this is not the case using the K-means algorithm.

2. Using the Gaussian distribution is quite unfair, since there could be students with

quite similar marks but with a difference in grade, just because they lie on either side

of a predefined boundary.

3. The K-means algorithm allows for better decision boundaries, depending on how skewed the performance of the students is, according to the difficulty of the course.

4. Unlike the Gaussian distribution, the K-means algorithm can be used for a fairer assignment of the grades, no matter how skewed the performance of students in a course is.

Figure 3.1: Grade distribution using a Gaussian curve.

Figure 3.2: Grade distribution using the K-means algorithm.

Chapter 4

Beacon Tracking

This chapter demonstrates the use of machine learning in beacon tracking using an unmanned ground vehicle (UGV) and a WiFi-enabled microcontroller such as the Vaman-ESP32.

4.1. Components
Component Value Quantity
USB-OTG 1
Vaman LC 1
USB-UART 1
UGV Kit 1
Battery 12 V 1
Android Phone 1

Table 4.1: Components Required for Beacon Tracking Using the Vaman-ESP32.

4.2. Procedure
1. Make the connections as per the wiring diagram in Fig. 4.1.

2. Connect the Vaman-ESP32 board to your Android Phone using USB-OTG.

3. Generate the firmware by entering the following commands.

cd ugv-beacon/codes
pio run

4. Go to ArduinoDroid and select

Actions → Upload → Upload Precompiled

and choose the firmware file at

ugv-beacon/codes/.pio/build/firmware.hex

5. Now put the phone at a reasonable distance from the UGV with no obstacles in the

way and then turn on the hotspot. The UGV should travel towards the phone and

stop near it.

4.3. Working

4.3.1. Underlying Principles


1. To estimate the (radial) distance to the beacon, we use its signal strength. For WiFi, this is the Received Signal Strength Indicator (RSSI).

2. The RSSI (R, in dBm) at a distance of d metres is given by

R(d) = R(1) − 10 log10(d)   (4.1)

Figure 4.1: Wiring Diagram for Beacon Tracking.

3. Clearly, R(d) is a convex function. Hence, we can use gradient descent.
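Under the model (4.1), an RSSI reading can be turned into a rough distance estimate by inverting the equation. The one-line sketch below is only illustrative; the calibration value r1_dbm (the RSSI at 1 m) is an assumption, not a number from the book.

# Distance estimate from RSSI by inverting (4.1): R(d) = R(1) - 10*log10(d)
def distance_from_rssi(rssi_dbm, r1_dbm=-40.0):
    return 10 ** ((r1_dbm - rssi_dbm) / 10.0)

print(distance_from_rssi(-50.0))   # 10.0 m under this model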

4.3.2. Algorithm Description


Please note that this is a generic description of the algorithm employed. Refer to

ugv-beacon/codes/src/main.cpp

for a more verbose implementation.

1. If the UGV is close enough to the beacon, terminate.

2. Take measurements at various points on a straight line.

3. Based on these measurements, decide the next move of the UGV, and recurse till the

UGV is close enough to the beacon.
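To make the loop concrete, here is a toy one-dimensional simulation of steps 1-3 in Python. It is not the firmware in main.cpp; the beacon position, step size and stopping threshold are all made-up values.

import math

def rssi(pos, beacon=8.0, r1=-40.0):                     # toy 1-D model of (4.1)
    return r1 - 10 * math.log10(max(abs(beacon - pos), 1e-3))

pos, step, target_rssi = 0.0, 0.5, -41.0                 # assumed start, step size, stop threshold
while rssi(pos) < target_rssi:                           # 1. terminate when close enough
    ahead, behind = rssi(pos + step), rssi(pos - step)   # 2. measure along a line
    pos += step if ahead >= behind else -step            # 3. move towards the stronger signal
print(round(pos, 2))                                     # ends close to the beacon at 8.0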

4.4. Observations
1. The UGV eventually converges close to the beacon (here, the hotspot).

2. However, if there are a lot of nearby obstacles, the UGV may not converge close to the location of the beacon: it may either get physically blocked on its way to the beacon, or the signal interference may be too high.

Chapter 5

TinyML

5.1. Gesture Recognition


This section illustrates a use case of TinyML in gesture detection. A video demonstration

of the experiment described below is present in the README of the repository.

5.1.1. Component
Component Value Quantity
USB-OTG 1
Vaman LC 1
USB-UART 1
Android Phone 1

Table 5.1: Components Required for Gesture Detection Using TinyML.

5.1.2. Training Data


1. Download the Sensor Logger app from Google Play Store.

2. In the settings on the app, set the sampling frequency up to 100 Hz, and turn off

logging uncalibrated data. Record from the accelerometer and gyroscope only.

3. Press Start Recording, and perform the gestures 100 times to gather the data.

4. Press Stop Recording when you are done and export the recordings as a zip file.

5. Unzip and rename the CSV files appropriately in

tiny-ml/gesture/data

6. In the same directory, run the following commands:

gcc -O2 format.c
./a.out

7. Two more CSV files train.csv and test.csv will be created in the same directory.

5.1.3. Model
1. Run the Python script

tiny-ml/gesture/codes/bin_class.py

2. Tweak the neural network parameters in this file if the accuracy is not satisfactory.

3. The neural network model will be present as a bytestream in

tiny-ml/gesture/codes/client/src/gesture_model.h
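For orientation, the sketch below shows the kind of small binary classifier such a script might train and export. It is not bin_class.py: the window length, layer sizes and file names are assumptions, and random data is used only to keep the sketch runnable.

import numpy as np
import tensorflow as tf

WINDOW = 50 * 6   # assumed: 50 samples per gesture x (3 accelerometer + 3 gyroscope) axes

model = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),      # gesture / no-gesture
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# X_train, y_train would normally come from train.csv; random data keeps this runnable.
X_train = np.random.rand(100, WINDOW).astype("float32")
y_train = np.random.randint(0, 2, 100)
model.fit(X_train, y_train, epochs=5, verbose=0)

# Convert to a TFLite flatbuffer, which can then be dumped as a C byte array
# (for example with xxd -i) to produce a header like gesture_model.h.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
open("gesture_model.tflite", "wb").write(tflite_model)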

5.1.4. Implementation
1. Find the IP address of your phone by using the ifconfig command.

2. In the Sensor Logger app, set the HTTP Push URL to

http://<IP>:5000/gesture

in the app settings. Enable the HTTP Push feature.

3. Start the server with the following commands (a minimal sketch of such a server is given at the end of this section):

cd tiny-ml/gesture/codes/server

flask run --host <IP>

4. Compile and upload the platformio project in

tiny-ml/gesture/codes/client

to the Vaman-ESP32 using USB-UART.

5. Attach a serial monitor to the terminal with the following command:

pio device monitor -b 115200

6. Start recording in the Sensor Logger app and perform the gestures. Verify whether

the model works as intended.

7. Implement a decade counter controlled via gesture detection.
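As referenced in step 3, a minimal Flask server for the HTTP push could look like the sketch below. It is not the book's server code; the payload format of the Sensor Logger push is an assumption here, so the handler simply stores whatever JSON arrives.

from flask import Flask, request

app = Flask(__name__)
latest = {}   # most recent sensor reading, to be consumed by the classifier

@app.route("/gesture", methods=["POST"])
def gesture():
    payload = request.get_json(silent=True) or {}
    latest.update(payload)          # keep whatever fields the app pushed
    return {"status": "ok"}

# `flask run --host <IP>` serves this module when FLASK_APP points to it.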

5.2. Gesture Controlled Seven Segment


This section demonstrates controlling a seven segment display with the gyroscope sensor present on the mobile phone.

5.2.1. Components

Component Value Quantity


Resistor 220 Ohm 1
Vaman LC 1
Seven Segment Display 1
USB-UART 1
Jumper Wires F-M 10
Bread board 1

Table 5.3: Components

5.2.2. Setup
1. Install the APK on the mobile phone; this application is required for accessing the sensors on the phone and sending the sensor data to the Vaman board.

cd tiny-ml/gesture-sevenseg

Click on the APK to install it and grant the necessary permissions.

2. Execute the following commands:

cd tiny-ml/gesture-sevenseg/codes/sevenseg
pio run
pio run -t upload

5.2.3. Connections
1. The pin diagram of the seven segment display is shown in Fig. 5.1.

Figure 5.1: Seven segment display pins (top row: g, f, COM, a, b; bottom row: e, d, COM, c, dot).

2. Connect the seven segment display to the Vaman board as shown in Table 5.5.

Vaman pin:        32  32  25  26  27  14  12
Display segment:   a   b   c   d   e   f   g

Table 5.5: Connections.

3. After uploading the code, find the IP address of the Vaman board and connect to it from the mobile application:

ifconfig
nmap -sn 192.168.x.0/24

4. Here 192.168.x.0 is only an example; replace "x" so that the subnet matches the address reported by ifconfig.

5. Enter the IP address reported for the Vaman board into the mobile application.

6. Then click Start in the mobile application.

5.2.4. Execution
1. Now tilt the mobile phone to change the digit shown on the seven segment display.

2. When the phone is tilted forward, 1 is displayed on the seven segment display.

3. When the phone is tilted left, 2 is displayed on the seven segment display.

4. When the phone is tilted right, 3 is displayed on the seven segment display.

5. When the phone is tilted back, 4 is displayed on the seven segment display.

6. When there is no movement, 0 is displayed, which indicates stop.

7. Repeat the above process to control the toy car using the gyroscope sensor:

cd tiny-ml/gesture-sevenseg/codes/toycar
pio run
pio run -t upload
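The tilt-to-digit mapping in steps 2-6 can be summarised in a few lines. The sketch below is only illustrative: the threshold and the use of pitch/roll angles are assumptions, and the real decision is made by the firmware running on the Vaman board.

# Illustrative mapping from detected tilt to the displayed digit
def digit_for_tilt(pitch_deg, roll_deg, threshold=30.0):
    if pitch_deg > threshold:    return 1   # forward
    if roll_deg < -threshold:    return 2   # left
    if roll_deg > threshold:     return 3   # right
    if pitch_deg < -threshold:   return 4   # back
    return 0                                # no movement: stop

print(digit_for_tilt(45.0, 0.0))   # 1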

5.3. Gesture Controlled UGV


This section illustrates a use case of TinyML in controlling an unmanned ground vehicle

(UGV). A video demonstration of the experiment described below is linked in the README

of the repository.

5.3.1. Components
Component Value Quantity
USB-OTG 1
Vaman LC 1
USB-UART 1
UGV Kit 1
Battery 12 V 1
Android Phone 1

Table 5.6: Components Required for Controlling the UGV Using TinyML.

5.3.2. Training Data


1. Download the Sensor Logger app from Google Play Store.

2. In the settings on the app, set the sampling frequency to maximum, and turn off

logging uncalibrated data. Record from the device orientation sensor only.

3. Press Start Recording, and perform the gestures 100 times to gather the data.

4. Press Stop Recording when you are done and export the recordings as a zip file.

5. Unzip and rename the CSV files appropriately in

tiny-ml/ugv/data

6. In the same directory, run the following commands:

gcc -O2 format.c
./a.out

7. Two more CSV files train.csv and test.csv will be created in the same directory.

5.3.3. Model
1. Run the Python script

tiny-ml/ugv/codes/multi_class.py

2. Tweak the neural network parameters in this file if the accuracy is not satisfactory.

3. The neural network model will be present as a bytestream in

tiny-ml/ugv/codes/client/src/gesture_model.h

5.3.4. Implementation
1. Find the IP address of your phone by using the ifconfig command.

2. In the Sensor Logger app, set the HTTP Push URL to

http://<IP>:5000/ugv

in the app settings. Enable the HTTP Push feature.

3. Run the server with the following commands.

cd tiny-ml/ugv/codes/server
flask run --host <IP>

4. Compile and upload the platformio project in

tiny-ml/ugv/codes/client

to the Vaman-ESP32 using USB-UART.

5. Attach a serial monitor to the terminal with the following command:

pio device monitor -b 115200

6. Start recording in the Sensor Logger app and perform the gestures. Verify whether

the model works as intended.

7. Detach the USB-UART from the Vaman-ESP32, and make connections to the inputs

of the L293 Motor Driver onboard the assembled UGV as per Table 5.7.

Vaman-ESP32 L293
14 EN A
15 EN B
16 MA 1
17 MA 2
18 MB 1
19 MB 2
5V VCC
GND GND

Table 5.7: Connections Between Vaman-ESP32 and L293 Motor Driver.

Appendix A

Three Dimensions

A.1. The lines

x = x1 + λ1 m1 (A.1)

x = x2 + λ2 m2 (A.2)

intersect if

Mλ = x2 − x1 (A.3)

where

 
M ≜ (m1  m2)   (A.4)

λ ≜ (λ1, −λ2)⊤   (A.5)

A.2. The closest points on two skew lines are given by

M⊤ Mλ = M⊤ (x2 − x1 ) (A.6)

Solution: For the lines defined in (A.1) and (A.2), suppose the closest points on both lines are

A = x1 + λ1 m1   (A.7)

B = x2 + λ2 m2   (A.8)

Then, AB is perpendicular to both lines, hence

m1 ⊤ (A − B) = 0 (A.9)

m2 ⊤ (A − B) = 0 (A.10)

=⇒ M⊤ (A − B) = 0   (A.11)

Using (A.7) and (A.8) in (A.11),

M⊤ (x1 − x2 + Mλ) = 0   (A.12)

yielding (A.6).
