
Commit

Updated readme files
Updated instructions in readme files and PD patch
Indeterminado committed Feb 20, 2021
1 parent 8d872db commit 8e2ba59
Showing 4 changed files with 113 additions and 70 deletions.
55 changes: 50 additions & 5 deletions Experiment resources/README.md
@@ -1,34 +1,79 @@

# NemmiProtoUno Experiment Resources

In this section you can find all resources needed to implement your own instance of the experiment.



For a detailed description of the experiment's structure, I urge you to read the paper describing it (see project root), as you will need to understand the rationale behind it. After you do, these files and their purpose should make sense.



In terms of hardware, you will need:



- [ ] one handheld device (smartphone or tablet) running Android

- [ ] the provided app installed on said device, with appropriate permissions already granted

- [ ] one computer with Pure Data installed

- [ ] good quality speakers

- [ ] one device to record video and audio (e.g. smartphone, camera), ideally mounted on a tripod



## Specific instructions

### Mobile app

The app should be running on the device you hand to participants, so download the provided source code, compile it, and make sure it runs on the device. The device should be connected to the same network as the computer running the PD patch.

### Pure Data control patch

This is how you will control the flow of the experiment. From this patch you will trigger the sounds for the participant to listen to and control the mobile app. The only thing you need to change is the IP address to which the patch sends messages. After the device with the app is connected to the network, check its IP address and update it in the PD patch before connecting. You can send a *soundoff* message from the patch to check that communication is working.
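
For orientation, here is a minimal sketch of the messaging involved, assuming a UDP `netsend` object (the app communicates over UDP; check the provided patch for the actual object and its arguments). The IP and port are the placeholders from the provided patch and must be replaced with your device's address:

```
#N canvas 0 0 450 300 12;
#X obj 60 200 netsend -u;
#X msg 60 60 connect 192.168.1.71 64000;
#X msg 170 110 send soundoff;
#X msg 280 160 disconnect;
#X connect 1 0 0 0;
#X connect 2 0 0 0;
#X connect 3 0 0 0;
```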

### Full experiment script

Just as the name says, this is the full script you should follow for the experiment. Much like a movie script, all moments are detailed, and you should follow it - of course, the specific discourse can be adapted.

### Participant details template
This is an SPSS file we provide, pre-formatted for input just as we used it, and ready to run the syntax file in [Results and Analysis](https://github.com/Indeterminado/NemmiProtoUnoPublicExperiment/tree/main/Results%20and%20Analysis "Results and Analysis"). The easiest way would be to first write down participant answers in an Excel spreadsheet or Word document and then convert them to this format.

Important things to know:

1. The gesture list is based on our own instance of the experiment and may not cover all gestures you encounter. In that case, you should edit the *Values* list for all corresponding mapping variables (the *Variable View* tab has descriptions for that - check the *uncategorized mappings* variables)
2. The same goes for participant rationale. In the paper we describe the categorization we implemented based on the answers we collected, but you are free to expand on this (check the *Variable View* tab for the *Rationale* variables and update the list)



*Variable View* descriptions should explain each variable's objective.
For each participant you will have to fill in:

- ID (just a numeric identifier)
- Profile (musician vs. non-musician)
- Age
- Gender
- Attributed order for the first three stimuli (read the paper)
- Details for each of the stimuli:
  - stimulus perception (X_PAR_PHASE)
  - gesture mapping (X_MAP_PHASE_L1)
  - gesture categorization (X_MAP_PHASE_L2)
  - mapping rationale (X_RSN_PHASE)

X represents the stimulus ID (a-e), PAR represents the musical parameter of that stimulus (PITch, DURation, AMPlitude), L1 is the uncategorized gesture, L2 is the broad category of the gesture (read the paper), RSN means reason, and PHASE represents the phase number of the data (read the... ok, you get it).
Stimulus D is the only one with a different variable structure, with its gesture variables following the format D_PARMAP_PHASE_L1. PAR follows the same logic as for the other stimuli.
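
As a purely illustrative example (assuming, hypothetically, that stimulus *a* tests pitch - the actual assignments are defined in the template), the phase 1 variables for stimulus *a* would be:

```
a_PIT_1     perception of the stimulus
a_MAP_1_L1  uncategorized gesture mapped to it
a_MAP_1_L2  broad gesture category
a_RSN_1     rationale for the mapping
```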



## Putting it all together
If you go over the full experiment script after reading the paper, you should be set to run the experiment yourself. If any doubts persist, or if any explanation is too confusing, feel free to drop me a message.



Alexandre Clément
68 changes: 33 additions & 35 deletions Experiment resources/control_patch.pd
@@ -20,15 +20,11 @@
#X obj 47 262 spigot 1, f 30;
#X obj 123 312 spigot 0, f 23;
#X msg 42 359 disconnect;
#X obj 374 100 sel 77;
#X obj 419 139 sel 78;
#X obj 374 157 bng 15 250 50 0 empty empty empty 17 7 0 10 -262144
#X obj 333 365 sel 78;
#X obj 333 403 bng 15 250 50 0 empty empty empty 17 7 0 10 -262144
-1 -1;
#X obj 419 177 bng 15 250 50 0 empty empty empty 17 7 0 10 -262144
-1 -1;
#X text 373 78 new test soundoff;
#X text 421 122 soundoff;
#X msg 152 391 send soundoff;
#X text 335 348 soundoff;
#X msg 210 402 send soundoff;
#X obj 550 19 bng 15 250 50 0 empty empty empty 17 7 0 10 -262144 -1
-1;
#X obj 550 80 t b b;
@@ -59,6 +55,10 @@
#X msg 697 261 open sound_stimuli/event_4.wav;
#X msg 735 309 open sound_stimuli/event_5.wav;
#X msg 16 208 connect 192.168.1.71 64000;
#X text 747 63 Stimuli trigger;
#X text 106 420 send soundoff to confirm communication \, button to
the right should flicker if all OK;
#X text 20 189 change to appropriate IP;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 3 0 7 0;
@@ -79,37 +79,35 @@
#X connect 14 0 12 0;
#X connect 15 0 9 0;
#X connect 15 0 11 0;
#X connect 15 0 41 0;
#X connect 15 0 38 0;
#X connect 16 0 13 0;
#X connect 16 0 0 0;
#X connect 17 0 0 0;
#X connect 17 0 14 0;
#X connect 18 0 0 0;
#X connect 19 0 21 0;
#X connect 19 1 20 0;
#X connect 20 0 22 0;
#X connect 25 0 0 0;
#X connect 19 0 20 0;
#X connect 22 0 0 0;
#X connect 23 0 24 0;
#X connect 24 0 25 0;
#X connect 24 1 42 0;
#X connect 25 0 26 0;
#X connect 26 0 27 0;
#X connect 27 0 28 0;
#X connect 27 1 45 0;
#X connect 26 0 27 1;
#X connect 28 0 29 0;
#X connect 29 0 30 0;
#X connect 29 0 30 1;
#X connect 31 0 32 0;
#X connect 32 0 28 0;
#X connect 32 1 46 0;
#X connect 33 0 34 0;
#X connect 34 0 28 0;
#X connect 34 1 47 0;
#X connect 35 0 36 0;
#X connect 36 0 28 0;
#X connect 36 1 48 0;
#X connect 37 0 38 0;
#X connect 38 0 28 0;
#X connect 38 1 49 0;
#X connect 45 0 29 0;
#X connect 46 0 29 0;
#X connect 47 0 29 0;
#X connect 48 0 29 0;
#X connect 49 0 29 0;
#X connect 50 0 0 0;
#X connect 29 0 25 0;
#X connect 29 1 43 0;
#X connect 30 0 31 0;
#X connect 31 0 25 0;
#X connect 31 1 44 0;
#X connect 32 0 33 0;
#X connect 33 0 25 0;
#X connect 33 1 45 0;
#X connect 34 0 35 0;
#X connect 35 0 25 0;
#X connect 35 1 46 0;
#X connect 42 0 26 0;
#X connect 43 0 26 0;
#X connect 44 0 26 0;
#X connect 45 0 26 0;
#X connect 46 0 26 0;
#X connect 47 0 0 0;
30 changes: 27 additions & 3 deletions Mobile app source code/README.md
@@ -1,11 +1,35 @@

# NemmiProtoUno Android App

In this section you can find the Android Studio project for the application used in the study of how users map mobile handheld device gestures to musical parameters.
It is a rather simple app that uses no external libraries and should compile without problems. It implements touch and accelerometer reading, as well as network communication (over UDP) and sensor data logging. Just open the **NemiProtoUno** folder with Android Studio and you should be good to go.




All code and assets are released under the [GNU AFFERO GENERAL PUBLIC LICENSE](https://opensource.org/licenses/AGPL-3.0); if you use them in any way, please respect it. And do let me know - I'm always interested in knowing how this might have been used!

## DISCLAIMER
Although I did test this app on many devices, the nature of the Android ecosystem is such that it IS possible that a given device is unable to run this app. No guarantee is made regarding that, but feel free to adapt and correct it.



For details on how to use this application in the context of the experiment, check section [Experiment resources](https://github.com/Indeterminado/NemmiProtoUnoPublicExperiment/tree/main/Experiment%20resources "Experiment resources") of this repository.

### Other info
If you want to access the logged data the application creates, you can find the log files under the *Android/com.feup.nemiprotouno/files/Nemmi_ProtoUno* directory on your device.
Logs follow this naming pattern: `NemmiProtoUnoLog_DATE_TIME_MILLISECONDS.txt`

Events are logged with a timestamp, followed by an event code and its values, as listed below (a hypothetical sample follows the list):

- Test ID (this value increments each time you send the *test start* message for a stimulus)
- accel X Y Z (accelerometer values on each axis)
- touchStart ID X Y (touch with ID started at X, Y coordinates)
- move ID X Y (touch with ID moved to X, Y coordinates)
- touchEnd ID X Y (touch with ID ended at X, Y coordinates)
- shake (user shook the device)
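
As a purely hypothetical illustration (the timestamp format and spacing are assumptions - only the event codes above are authoritative), a log excerpt could look like:

```
1613810000123 accel 0.12 -0.03 9.81
1613810000150 touchStart 0 512 830
1613810000310 move 0 540 790
1613810000480 touchEnd 0 560 770
1613810001020 shake
```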


Alexandre Clément
30 changes: 3 additions & 27 deletions Results and Analysis/README.md
@@ -1,28 +1,4 @@
# NemmiProtoUno Experiment Protocol
What is this?
# NemmiProtoUno Results and Analysis
In this section you can find all the results of the analysis we ran during our experiment, as well as the SPSS syntax file so you can run the same tests.

This repository compiles all needed resources to run the experimental protocol detailed and presented in the article *"Musical Control Gestures in Mobile Handheld Devices: design guidelines informed by daily user experience"* (link to come as soon as it is published).

This experimental protocol aims to allow for a systematized study of how users map mobile handheld device gestures to musical parameters.

It is also part of the larger project *Nemmi*.

## What is Nemmi?

*Nemmi* stands for *Non-Emulative Mobile Musical Instrument* and is the PhD project of Alexandre Clément, Digital Media PhD candidate at [FEUP](https://www.fe.up.pt).
Mobile musical instruments are abundant in the mobile app stores, but most either attempt to emulate traditional musical instruments or approach their interaction and interface design with very specific methods. There is no standardized way of interacting with and controlling musical instruments on handheld devices, especially one designed with the particularities of those devices in mind.
*Nemmi* is an attempt to do exactly that: design a mobile musical instrument that is anchored in user experience and developed with handheld device operation and specificities in mind.

### What about NemmiProtoUno - what's that?
While developing this project, several prototypes were created, at different stages, in order to test and evaluate different things. With the prototypes came the need to distinguish them, and so the nomenclature *NemmiProtoX* was adopted (no particular reason for that).
*NemmiProtoUno* is the prototype developed in order to study how users would appropriate the device and represent musical parameters through common control gestures.

## What is in this repository, then?
Everything needed to reproduce the experimental test in order to verify or expand on the results - or even extend it towards testing with other parameters. We tried to understand how users instinctively mapped *Note Onset*, *Note Pitch*, *Note Duration* and *Note Amplitude*. The provided resources allow anyone to reproduce this testing, and can easily be adapted.
- In the [Results and Analysis](https://github.com/Indeterminado/NemmiProtoUnoPublicExperiment/tree/main/Results%20and%20Analysis "Results and Analysis") section you can find the results for the experiment we ran.
- In the [Experiment resources](https://github.com/Indeterminado/NemmiProtoUnoPublicExperiment/tree/main/Experiment%20resources "Experiment resources") you can find all necessary resources to run the same experimental protocol, as well as instructions on doing so.
- In [Mobile app source code/NemiProtoUno](https://github.com/Indeterminado/NemmiProtoUnoPublicExperiment/tree/main/Mobile%20app%20source%20code/NemiProtoUno "This path skips through empty directories") you will find the Android Studio project for the app needed to run the experiment.

And that's it. Each section has a specific readme file with details regarding its contents. For anything else, feel free to shoot me a message.

Alexandre Clément
