Notes on 1st project

The data provided at iui-datarelease1-sose2021-readonly/* is sensor data recorded from a pen.

Data     MinVal  MaxVal  Description
Millis   -       -       Timestamp from tablet (Unix time)
Acc1 X   -       32768   Front/Tip accelerometer (Direction: Left/Right)
Acc1 Y   -       32768   Front/Tip accelerometer (Direction: Up/Down)
Acc1 Z   -       32768   Front/Tip accelerometer (Direction: Back/Front)
Acc2 X   -       8192    Back accelerometer (Direction: Left/Right)
Acc2 Y   -       8192    Back accelerometer (Direction: Up/Down)
Acc2 Z   -       8192    Back accelerometer (Direction: Back/Front)
Gyro X   -       32768   Gyroscope sensor
Gyro Y   -       32768   Gyroscope sensor
Gyro Z   -       32768   Gyroscope sensor
Mag X    -       8192    Magnetometer
Mag Y    -       8192    Magnetometer
Mag Z    -       8192    Magnetometer
Force    -       4096    Force applied
Time     -       -       Time from start of "recording"

There were 100 participants.

The folder structure is as follows:
/opt/iui-datarelease1-sose2021/{P}/split_letters_csv/{N}{A}.csv

Variable  Description
P         The ID of the participant
N         The N-th letter the participant wrote
A         The letter that was written

Each participant's folder contains a calibration.txt, which holds the pen calibration data for that participant.
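A minimal loading sketch, assuming the layout above (the glob pattern, the filename regex and the function name are illustrative assumptions, not code from this project):

import glob
import os
import re

import pandas as pd

def load_letters(base="/opt/iui-datarelease1-sose2021"):
    # Collect (participant, n, letter, dataframe) tuples for every split-letter CSV.
    samples = []
    for path in glob.glob(os.path.join(base, "*", "split_letters_csv", "*.csv")):
        participant = os.path.basename(os.path.dirname(os.path.dirname(path)))
        stem = os.path.splitext(os.path.basename(path))[0]   # e.g. "3a" -> N=3, A="a"
        match = re.fullmatch(r"(\d+)(\D+)", stem)
        if match is None:
            continue
        samples.append((participant, int(match.group(1)), match.group(2), pd.read_csv(path)))
    return samples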

Sensor data was recorded at 100 Hz (100 recordings/s, i.e. one recording every 10 ms).

Preprocessing

General

Since the sensor channels have different value ranges (e.g. Acc1: [-32768; 32768] vs. Acc2: [-8192; 8192]), the channels have to be weighted differently according to their importance.
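A minimal sketch of per-channel scaling, assuming the data sits in a pandas DataFrame whose column names match the table above (the dictionary and the function name are made up for illustration):

import pandas as pd

# Nominal maxima taken from the table above.
SENSOR_MAX = {
    "Acc1 X": 32768, "Acc1 Y": 32768, "Acc1 Z": 32768,
    "Acc2 X": 8192, "Acc2 Y": 8192, "Acc2 Z": 8192,
    "Gyro X": 32768, "Gyro Y": 32768, "Gyro Z": 32768,
    "Mag X": 8192, "Mag Y": 8192, "Mag Z": 8192,
    "Force": 4096,
}

def normalize_channels(df: pd.DataFrame) -> pd.DataFrame:
    # Divide every sensor column by its nominal maximum so all channels
    # end up roughly in [-1, 1] and become comparable in scale.
    df = df.copy()
    for column, max_val in SENSOR_MAX.items():
        if column in df.columns:
            df[column] = df[column] / max_val
    return df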

Millis

  • Could be used to identify each data entry -> needs to be shifted so that it is relative to the first entry of the recording, which makes the complete timeline of the data visible (see the sketch below)
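A one-liner sketch of that offset, assuming a pandas DataFrame with a Millis column as in the table above (the function name is made up):

import pandas as pd

def rebase_millis(df: pd.DataFrame) -> pd.DataFrame:
    # Shift the timestamps so the first sample of the recording is at 0 ms.
    df = df.copy()
    df["Millis"] = df["Millis"] - df["Millis"].iloc[0]
    return df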

Acc1/Acc2/Gyro/Mag

TODO

Force

  • Sometimes sensor data was recorded even when there was no writing action -> we need to determine the area of interest (see the sketch below)

    • maybe a sliding window, where the window average has to exceed a certain threshold
    • general threshold approach (filter out data below the threshold)
    • more ideas welcome
  • Data could be normalized by each user's relative strength or per data entry
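One possible sketch of the threshold idea: cut the recording down to the region where the force exceeds a threshold, plus a few samples of leeway on both sides. The names thresh and leeway mirror the parameters listed in the runs below; how the area of interest was actually cut there is not documented, so this is only an assumption:

import numpy as np

def force_region(force, thresh=70, leeway=3):
    # Indices where the force is above the threshold, extended by `leeway`
    # samples on both sides; returns a slice into the original recording.
    active = np.flatnonzero(np.asarray(force) > thresh)
    if active.size == 0:
        return slice(0, 0)  # nothing above the threshold
    start = max(int(active[0]) - leeway, 0)
    stop = min(int(active[-1]) + leeway + 1, len(force))
    return slice(start, stop)

Applied per recording, e.g. df.iloc[force_region(df["Force"].to_numpy())], this keeps only the part where the pen was actually pressed down.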

Time

Neural Network

This section contains notes on the neural network itself.

Ideas

  • Don't use batch normalization; instead, normalize each channel by the sensor's maximum value (as sketched in the preprocessing section above)

Results

Test1 (72.99%)

thresh      = 100
leeway      = 5
epoch       = 20

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import BatchNormalization, Dense, Flatten

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
Evaluate on test data
82/82 [==============================] - 0s 3ms/step - loss: 1.4249 - acc: 0.7299

Test2 (75.21%)

thresh      = 75
leeway      = 3
epoch       = 20

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 1.5145 - acc: 0.7521

Test3 (36.96%)

thresh      = 10
leeway      = 3
epoch       = 20

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 2.5231 - acc: 0.3696

Test4 (69.66%)

thresh      = 50
leeway      = 3
epoch       = 20

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 1.6965 - acc: 0.6966

Test5 (73.13%)

thresh      = 60
leeway      = 3
epoch       = 20

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 1.4886 - acc: 0.7313

Test6 (75.68%)

thresh      = 68
leeway      = 3
epoch       = 20

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 1.4227 - acc: 0.7568

Test7 (76.07%)

thresh      = 68
leeway      = 3
epoch       = 30

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 1.5863 - acc: 0.7607

Test8 (75.49%)

THRESH = 70
LEEWAY = 3
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1800
Evaluate on test data
21/21 [==============================] - 0s 2ms/step - loss: 1.5598 - acc: 0.7684
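From Test8 on, only THRESH/LEEWAY/EPOCH and DENSE_COUNT/DENSE_NEURONS are listed, so the model is presumably built from those parameters. A sketch of such a builder, assuming DENSE_COUNT hidden layers with DENSE_NEURONS units each (the function itself, the equal layer widths and the optimizer/loss are assumptions; norm_first=True corresponds to the NORM->FLAT->DENSE note from Test13 onwards, read here as BatchNormalization before Flatten):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import BatchNormalization, Dense, Flatten

def build_model(input_shape, dense_count=5, dense_neurons=1800, norm_first=False):
    model = Sequential()
    if norm_first:
        # NORM->FLAT->DENSE: normalize the raw per-channel values first
        model.add(BatchNormalization(input_shape=input_shape))
        model.add(Flatten())
    else:
        # FLAT->NORM->DENSE ordering, as in the models shown for Test1-Test7
        model.add(Flatten(input_shape=input_shape))
        model.add(BatchNormalization())
    for _ in range(dense_count):
        model.add(Dense(dense_neurons, activation='relu'))
    model.add(Dense(26, activation='softmax'))
    # Optimizer and loss are not documented in this README; placeholders only.
    model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['acc'])
    return model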

Test9 (78.15%)

THRESH = 70
LEEWAY = 1
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1800
Evaluate on test data
21/21 [==============================] - 0s 2ms/step - loss: 1.4677 - acc: 0.7815

Test10 (77.64%)

THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1900

Test11 (77.41%)

THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1800

Test12 (73.90%)

THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1800

DENSE_COUNT = 5
DENSE_NEURONS = 900

Test13 (78.89%)

NORM->FLAT->DENSE

THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1900

Test14 (79.04%)

NORM->FLAT->DENSE

THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1800

Test15 (79.00%)

NORM->FLAT->DENSE

THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1700

Test16 (78.69%)

NORM->FLAT->DENSE

THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1600

Test17 (78.57%)

NORM->FLAT->DENSE

THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 4
DENSE_NEURONS = 1800

Test18 (78.12%)

NORM->FLAT->DENSE

THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 6
DENSE_NEURONS = 1800

Test19 (79.13%)

NORM->FLAT->DENSE

THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 3
DENSE_NEURONS = 1800

DENSE2_COUNT = 2
DENSE2_NEURONS = 1200