# Notes on 1st project

The given data at [iui-datarelease1-sose2021-readonly/\*](/opt/iui-datarelease1-sose2021) represents sensor data recorded from a pen.

| Data   | MinVal | MaxVal | Description                                     |
| ------ |:------:|:------:| ----------------------------------------------- |
| Millis | -      | -      | Timestamp from tablet (Unix time)               |
| Acc1 X | -      | 32768  | Front/Tip accelerometer (Direction: Left/Right) |
| Acc1 Y | -      | 32768  | Front/Tip accelerometer (Direction: Up/Down)    |
| Acc1 Z | -      | 32768  | Front/Tip accelerometer (Direction: Back/Front) |
| Acc2 X | -      | 8192   | Back accelerometer (Direction: Left/Right)      |
| Acc2 Y | -      | 8192   | Back accelerometer (Direction: Up/Down)         |
| Acc2 Z | -      | 8192   | Back accelerometer (Direction: Back/Front)      |
| Gyro X | -      | 32768  | Gyroscope sensor                                |
| Gyro Y | -      | 32768  | Gyroscope sensor                                |
| Gyro Z | -      | 32768  | Gyroscope sensor                                |
| Mag X  | -      | 8192   | Magnetometer                                    |
| Mag Y  | -      | 8192   | Magnetometer                                    |
| Mag Z  | -      | 8192   | Magnetometer                                    |
| Force  | -      | 4096   | Force applied                                   |
| Time   | -      | -      | Time from start of "recording"                  |

There were 100 participants.

The folder structure is as follows:\
`/opt/iui-datarelease1-sose2021/{P}/split_letters_csv/{N}{A}.csv`

| Variable | Description                           |
| -------- | ------------------------------------- |
| P        | The ID of the participant             |
| N        | The N-th letter the participant wrote |
| A        | The letter that was written           |

Each participant's folder contains a `calibration.txt`, which holds the calibration data of the pen for that participant.

Sensor data was recorded at 100 Hz (100 samples per second, i.e. one sample every 10 ms).

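The per-letter CSVs can be enumerated directly from this path pattern. A minimal loading sketch, assuming `pandas` is available and that `N` is the leading digits and `A` the trailing letter of the file name; the column layout inside the CSVs is whatever the files actually contain:

```python
import glob
import os
import re

import pandas as pd

DATA_ROOT = "/opt/iui-datarelease1-sose2021"

def load_all_letters(root=DATA_ROOT):
    """Yield (participant, n, letter, dataframe) for every split-letter CSV."""
    pattern = os.path.join(root, "*", "split_letters_csv", "*.csv")
    for path in sorted(glob.glob(pattern)):
        # .../{P}/split_letters_csv/{N}{A}.csv
        participant = os.path.basename(os.path.dirname(os.path.dirname(path)))
        stem = os.path.splitext(os.path.basename(path))[0]   # e.g. "12a"
        match = re.match(r"(\d+)(\D+)", stem)                 # N = "12", A = "a"
        if match is None:
            continue
        n, letter = int(match.group(1)), match.group(2)
        yield participant, n, letter, pd.read_csv(path)
```
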
## Preprocessing

### General

Since the sensors have different value ranges (e.g. Acc1: [-32768; 32768] vs. Acc2: [-8192; 8192]), the channels have to be rescaled or weighted so that a channel's influence reflects its importance rather than its raw numeric range.

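One simple option is to divide each channel by its maximum absolute value from the table above, so every channel ends up roughly in [-1, 1]. A sketch under the assumption that the 13 sensor columns are ordered as in the table:

```python
import numpy as np

# Max absolute values per channel, taken from the sensor table above:
# Acc1 (x/y/z), Acc2 (x/y/z), Gyro (x/y/z), Mag (x/y/z), Force
SENSOR_MAX = np.array([32768, 32768, 32768,
                       8192, 8192, 8192,
                       32768, 32768, 32768,
                       8192, 8192, 8192,
                       4096], dtype=np.float32)

def normalize_channels(samples):
    """Scale a (timesteps, 13) array of sensor readings into roughly [-1, 1]."""
    return np.asarray(samples, dtype=np.float32) / SENSOR_MAX
```
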
### Millis

- Could be used to index each data entry -> needs to be normalized to the first entry of the recording so that the complete timeline of the data starts at 0 (see the sketch below)

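A minimal sketch of that normalization, assuming the timestamp column is named `Millis`:

```python
def normalize_millis(df):
    """Shift the Millis column so the recording starts at 0."""
    df = df.copy()
    df["Millis"] = df["Millis"] - df["Millis"].iloc[0]
    return df
```
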
### Acc1/Acc2/Gyro/Mag

todo

### Force

- Sometimes sensor data was recorded even when there was no pen action -> we need to determine the area of interest (see the sketch after this list)
  - maybe a sliding window, where the window average has to exceed a certain threshold
  - general threshold approach (filter out data below a threshold)
  - more ideas welcome
- Data could be normalized by each user's relative strength or per data entry

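A minimal sketch of the plain threshold approach with a small margin of extra samples kept on each side; `thresh` and `leeway` here are presumably the same parameters that appear in the test configurations below (an assumption):

```python
import numpy as np

def crop_to_force_region(samples, force, thresh=70, leeway=3):
    """Keep only the part of a recording where the pen is actually pressed down.

    samples: (timesteps, channels) array, force: (timesteps,) array.
    Returns the slice from the first to the last index where force > thresh,
    widened by `leeway` samples on each side.
    """
    active = np.flatnonzero(np.asarray(force) > thresh)
    if active.size == 0:
        return samples                      # nothing above threshold, keep everything
    start = max(active[0] - leeway, 0)
    end = min(active[-1] + leeway + 1, len(samples))
    return samples[start:end]
```
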
### Time

## Neural Network

This section contains notes on the neural network itself.

### Ideas

- Don't use batch normalization; instead normalize each channel by the sensor's maximum value

### Results

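The test entries below record only the preprocessing parameters (`thresh`/`leeway`, presumably the force cropping from above), the epoch count, and the layer stack. A minimal sketch of the surrounding training code they assume, using the Test1 stack as an example; the optimizer, loss, validation split, and the names `X_filter`/`y` are assumptions, not the recorded setup:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, BatchNormalization, Dense

def build_and_train(X_filter, y, epoch=20):
    """X_filter: list/array of fixed-size (timesteps, channels) samples after
    force cropping and padding; y: integer labels 0..25 for the letters a..z."""
    model = Sequential()
    model.add(Flatten(input_shape=X_filter[0].shape))
    model.add(BatchNormalization())
    model.add(Dense(1560, activation='relu'))
    model.add(Dense(750, activation='relu'))
    model.add(Dense(300, activation='relu'))
    model.add(Dense(156, activation='relu'))
    model.add(Dense(26, activation='softmax'))

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['acc'])
    model.fit(np.stack(X_filter), np.asarray(y),
              epochs=epoch, validation_split=0.2)
    return model
```
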
#### Test1 (72.99%)

```python
thresh = 100
leeway = 5
epoch = 20

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
```

```
Evaluate on test data
82/82 [==============================] - 0s 3ms/step - loss: 1.4249 - acc: 0.7299
```

#### Test2 (75.21%)

```python
thresh = 75
leeway = 3
epoch = 20

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
```

```
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 1.5145 - acc: 0.7521
```

#### Test3 (36.96%)

```python
thresh = 10
leeway = 3
epoch = 20

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
```

```
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 2.5231 - acc: 0.3696
```

#### Test4 (69.66%)

```python
thresh = 50
leeway = 3
epoch = 20

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
```

```
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 1.6965 - acc: 0.6966
```

#### Test5 (73.13%)

```python
thresh = 60
leeway = 3
epoch = 20

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
```

```
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 1.4886 - acc: 0.7313
```

#### Test6 (75.68%)

```python
thresh = 68
leeway = 3
epoch = 20

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
```

```
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 1.4227 - acc: 0.7568
```

#### Test7 (76.07%)

```python
thresh = 68
leeway = 3
epoch = 30

model = Sequential()
model.add(Flatten(input_shape=X_filter[0].shape))
model.add(BatchNormalization())
model.add(Dense(1560, activation='relu'))
model.add(Dense(750, activation='relu'))
model.add(Dense(300, activation='relu'))
model.add(Dense(156, activation='relu'))
model.add(Dense(26, activation='softmax'))
```

```
Evaluate on test data
82/82 [==============================] - 0s 2ms/step - loss: 1.5863 - acc: 0.7607
```

#### Test8 (75.49%)

```python
THRESH = 70
LEEWAY = 3
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1800
```

```
Evaluate on test data
21/21 [==============================] - 0s 2ms/step - loss: 1.5598 - acc: 0.7684
```

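From Test8 onwards only constants are recorded. A hedged sketch of how `DENSE_COUNT`/`DENSE_NEURONS` might expand into a layer stack; this is an assumed reconstruction, not the actual generator code (whether the hidden widths stay constant or taper as in Tests 1-7 is not recorded):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, BatchNormalization, Dense

def build_dense_model(input_shape, dense_count=5, dense_neurons=1800,
                      norm_first=False, num_classes=26):
    """Hypothetical expansion of the DENSE_COUNT/DENSE_NEURONS constants.

    Constant hidden widths are assumed here. norm_first=True corresponds to
    the NORM->FLAT->DENSE variant noted from Test13 onwards.
    """
    model = Sequential()
    if norm_first:
        model.add(BatchNormalization(input_shape=input_shape))
        model.add(Flatten())
    else:
        model.add(Flatten(input_shape=input_shape))
        model.add(BatchNormalization())
    for _ in range(dense_count):
        model.add(Dense(dense_neurons, activation='relu'))
    model.add(Dense(num_classes, activation='softmax'))
    return model
```
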
#### Test9 (78.15%)

```python
THRESH = 70
LEEWAY = 1
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1800
```

```
Evaluate on test data
21/21 [==============================] - 0s 2ms/step - loss: 1.4677 - acc: 0.7815
```

#### Test10 (77.64%)

```python
THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1900
```

#### Test11 (77.41%)

```python
THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1800
```

#### Test12 (73.90%)

```python
THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1800

# The second pair presumably describes a second dense block (cf. DENSE2_* in
# Test19); as plain Python it would simply override the values above.
DENSE_COUNT = 5
DENSE_NEURONS = 900
```

#### Test13 (78.89%)

NORM->FLAT->DENSE (BatchNormalization before Flatten, instead of after as in Tests 1-7)

```python
THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1900
```

#### Test14 (79.04%)

NORM->FLAT->DENSE

```python
THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1800
```

#### Test15 (79.00%)

NORM->FLAT->DENSE

```python
THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1700
```

#### Test16 (78.69%)

NORM->FLAT->DENSE

```python
THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 5
DENSE_NEURONS = 1600
```

#### Test17 (78.57%)

NORM->FLAT->DENSE

```python
THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 4
DENSE_NEURONS = 1800
```

#### Test18 (78.12%)

NORM->FLAT->DENSE

```python
THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 6
DENSE_NEURONS = 1800
```

#### Test19 (79.13%)

NORM->FLAT->DENSE

```python
THRESH = 70
LEEWAY = 0
EPOCH = 30

DENSE_COUNT = 3
DENSE_NEURONS = 1800

DENSE2_COUNT = 2
DENSE2_NEURONS = 1200
```

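Test19 adds a second block of dense layers via `DENSE2_COUNT`/`DENSE2_NEURONS`. A brief sketch of one plausible reading, again an assumption rather than the recorded code:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, BatchNormalization, Dense

def build_two_stack_model(input_shape, num_classes=26):
    """NORM->FLAT->DENSE with two dense blocks, as suggested by Test19."""
    model = Sequential()
    model.add(BatchNormalization(input_shape=input_shape))
    model.add(Flatten())
    for _ in range(3):                                # DENSE_COUNT = 3
        model.add(Dense(1800, activation='relu'))     # DENSE_NEURONS = 1800
    for _ in range(2):                                # DENSE2_COUNT = 2
        model.add(Dense(1200, activation='relu'))     # DENSE2_NEURONS = 1200
    model.add(Dense(num_classes, activation='softmax'))
    return model
```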