Updated instructions in readme files and PD patch
1 parent 8d872db · commit 8e2ba59
Showing 4 changed files with 113 additions and 70 deletions.
# NemmiProtoUno Experiment Resources

In this section you can find all resources needed to implement your own instance of the experiment.

For a specific description of the structure of the experiment, I would urge you to read the paper describing it (see project root), as you will need to be aware of the rationale behind it to understand its structure. After you do, these files and their objectives should make sense.

In terms of hardware, you will need:

- [ ] one handheld device (smartphone or tablet) running Android
- [ ] the provided app installed on said device, with appropriate permissions already granted
- [ ] one computer with Pure Data installed
- [ ] good quality speakers
- [ ] one device to record video and audio (e.g. smartphone, camera), ideally mounted on a tripod
## Specific instructions

### Mobile app

The app should be running on the device you hand participants, so download the source code provided, compile it, and make sure it runs on the device. The device must be connected to the same network as the computer running the PD patch.

### Pure Data control patch

This is how you will control the flow of the experiment. From this patch you will trigger the sounds for the participant to listen to and control the mobile app. The only thing you need to change is the IP address to which the patch will send messages. After the device with the app is connected to the network, check its IP address and change it in the PD patch before connecting. You can send a *soundoff* message from the patch to check whether communication is working.
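If you want to sanity-check the connection independently of the patch, the same test message can be sent from any machine on the network. The sketch below assumes the patch uses Pd's [netsend] in UDP mode, which transmits semicolon-terminated FUDI text; the device IP and port are placeholders you must replace with your own values, since the port is defined in the patch itself.

```python
# Hedged sketch: send a "soundoff" test message over UDP the way a
# Pd [netsend -u] object would (FUDI text, i.e. a semicolon-terminated string).
import socket

DEVICE_IP = "192.168.1.42"  # placeholder: the handheld device's IP on your network
PORT = 3000                 # placeholder: must match the port used by the patch/app

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(b"soundoff;\n", (DEVICE_IP, PORT))
sock.close()
```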
### Full experiment script

Just as the name says, this is the full script you should follow for the experiment. Much like a movie script, all moments are detailed, and you should follow it; of course, the specific discourse can be adapted.

### Participant details template

This is an SPSS file we provide, pre-formatted for input just as we used it, and ready to run the syntax file in [Results and Analysis](https://github.com/Indeterminado/NemmiProtoUnoPublicExperiment/tree/main/Results%20and%20Analysis "Results and Analysis"). The easiest way would be to first write down participant answers in an Excel spreadsheet or Word document and then convert them to this format.

Important things to know:
1. The gesture list is based on our own instance of the experiment and may not cover all gestures you encounter. In that case you should edit the *Values* list for all corresponding mapping variables (the *Variable View* tab has descriptions for that - check the *uncategorized mappings* variables)
2. The same goes for participant rationale. In the paper we describe the categorization we implemented based on the answers we collected, but you are free to expand on this (check the *Variable View* tab for the *Rationale* variables and update the list)
*Variable View* descriptions should explain each variable's objective.
For each participant you will have to fill in:

- ID (this is just a numeric identifier)
- Profile (musician VS non-musician)
- Age
- Gender
- Attributed order for the first three stimuli (read the paper)
- Details for each of the stimuli
  - stimulus perception (X_PAR_PHASE)
  - gesture mapping (X_MAP_PHASE_L1)
  - gesture categorization (X_MAP_PHASE_L2)
  - mapping rationale (X_RSN_PHASE)

X represents the stimulus id (a-e), PAR represents the musical parameter of that stimulus (PITch, DURation, AMPlitude), L1 is the uncategorized gesture, L2 is the broad category of the gesture (read the paper), RSN means reason, and PHASE represents the phase number of the data (read the... ok, you get it).
Stimulus D is the only one with a different variable structure, with the gesture variables following the format D_PARMAP_PHASE_L1. PAR follows the same logic as for the other stimuli.
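As an illustration of this scheme, the short sketch below enumerates the variable names expected for one stimulus. The casing and the exact phase numbers are assumptions on my part, so check the *Variable View* tab for the authoritative names.

```python
# Illustrative (assumed) reconstruction of the variable-naming scheme above.
def variable_names(stimulus: str, par: str, phase: int) -> list[str]:
    """stimulus: 'a'-'e'; par: 'PIT', 'DUR' or 'AMP'; phase: phase number."""
    s = stimulus.upper()
    # Stimulus D uses the D_PARMAP_PHASE_L1 structure for its gesture variables.
    map_stem = f"{s}_{par}MAP" if s == "D" else f"{s}_MAP"
    return [
        f"{s}_{par}_{phase}",      # stimulus perception (X_PAR_PHASE)
        f"{map_stem}_{phase}_L1",  # uncategorized gesture mapping
        f"{map_stem}_{phase}_L2",  # broad gesture category
        f"{s}_RSN_{phase}",        # mapping rationale
    ]

print(variable_names("a", "PIT", 1))
# -> ['A_PIT_1', 'A_MAP_1_L1', 'A_MAP_1_L2', 'A_RSN_1']
```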
## Putting it all together

If you go over the full experiment script after reading the paper, you should be set to run the experiment yourself. If any doubts persist, or if the explanation is too confusing, feel free to drop me a message.

Alexandre Clément
# NemmiProtoUno Android App

In this section you can find the Android Studio project for the application used in the study of how users map mobile handheld device gestures to musical parameters.

It is a rather simple app, uses no external libraries, and should compile without problems. It implements functionality for touch and accelerometer reading, as well as network communication (over UDP) and sensor data logging. Just open the **NemiProtoUno** folder with Android Studio and you should be good to go.
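If you want to inspect what arrives over the network before the app is involved, a stand-in UDP listener is enough. This is a hedged sketch: the port is a placeholder and must match whatever the PD patch is configured to use.

```python
# Stand-in UDP listener: run it on any host, point the PD patch at that host's
# IP, and send a *soundoff* message to see the raw bytes the patch transmits.
import socket

PORT = 3000  # placeholder: must match the port configured in the PD patch

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", PORT))
print(f"listening on UDP port {PORT}...")
while True:
    data, addr = sock.recvfrom(1024)
    print(f"{addr}: {data!r}")
```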
All code and assets are released under the [GNU AFFERO GENERAL PUBLIC LICENSE](https://opensource.org/licenses/AGPL-3.0); if you use them in any way, please respect it. And do let me know - I'm always interested in knowing how it might have been used!

## DISCLAIMER

Although I did test this app on many devices, the nature of the Android ecosystem is such that it IS possible that a given device is unable to run this app. No guarantee is made regarding that, but feel free to adapt and correct it.
For details on how to use this application in the context of the experiment, check the [Experiment resources](https://github.com/Indeterminado/NemmiProtoUnoPublicExperiment/tree/main/Experiment%20resources "Experiment resources") section of this repository.

### Other info

If you want to access the logged data the application creates, you can find the log files under the *Android/com.feup.nemiprotouno/files/Nemmi_ProtoUno* directory on your device.
Logs follow this naming pattern: NemmiProtoUnoLog_DATE_TIME_MILLISECONDS.txt

Events are logged with a timestamp. Event codes and values are as follows:

- Test ID (when you send the *test start* message for each stimulus, this value increments)
- accel X Y Z (accelerometer values on each axis)
- touchStart ID X Y (touch with ID started at X, Y coordinates)
- move ID X Y (touch with ID moved to X, Y coordinates)
- touchEnd ID X Y (touch with ID ended at X, Y coordinates)
- shake (user shook the device)
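For pulling these logs into an analysis, a minimal parser can be sketched from the event list above. The field separator and timestamp format are assumptions here (whitespace-separated, timestamp first), so adjust it to the actual files your device produces.

```python
# Hedged log-parsing sketch based on the event codes listed above.
def parse_log_line(line: str) -> dict:
    fields = line.split()
    timestamp, event, *values = fields
    if event == "accel":                               # accel X Y Z
        data = dict(zip(("x", "y", "z"), map(float, values)))
    elif event in ("touchStart", "move", "touchEnd"):  # <event> ID X Y
        data = {"id": int(values[0]), "x": float(values[1]), "y": float(values[2])}
    elif event == "shake":                             # no values
        data = {}
    else:                                              # e.g. test-ID entries
        data = {"values": values}
    return {"timestamp": timestamp, "event": event, **data}

# Usage: parse one log file (placeholder name pattern from above).
with open("NemmiProtoUnoLog_DATE_TIME_MILLISECONDS.txt") as f:
    events = [parse_log_line(line) for line in f if line.strip()]
```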
Alexandre Clément
# NemmiProtoUno Experiment Protocol

## What is this?

This repository compiles all needed resources to run the experimental protocol detailed and presented in the article *"Musical Control Gestures in Mobile Handheld Devices: design guidelines informed by daily user experience"* (link to come as soon as it is published).

This experimental protocol aims to allow for a systematized study of how users map mobile handheld device gestures to musical parameters.

It is also part of the larger project *Nemmi*.

## What is Nemmi?

*Nemmi* stands for *Non-Emulative Mobile Musical Instrument* and is the PhD project of Alexandre Clément, Digital Media PhD candidate at [FEUP](https://www.fe.up.pt).
Mobile musical instruments are abundant in the mobile app stores, but most either attempt to emulate traditional musical instruments or approach their interaction and interface design with very specific methods. There is no uniformized way of interacting with and controlling musical instruments on handheld devices, especially one designed with the particularities of those devices in mind.
*Nemmi* is an attempt to do exactly that: design a mobile musical instrument that is anchored in user experience and developed with handheld device operation and specificities in mind.

### What about NemmiProtoUno - what's that?

While developing this project, several prototypes were created at different stages in order to test and evaluate different things. With the prototypes came the need to distinguish them, and so the nomenclature *NemmiProtoX* was adopted (no particular reason for that).
*NemmiProtoUno* is the prototype developed in order to study how users would appropriate the device and represent musical parameters through common control gestures.

## What is in this repository, then?

Everything needed to reproduce the experimental test in order to verify or expand on the results - or even expand towards testing with other parameters. We tried to understand how users instinctively mapped *Note Onset*, *Note Pitch*, *Note Duration* and *Note Amplitude*; the provided resources allow anyone to reproduce this testing, but they can easily be extended to other parameters.

- In the [Results and Analysis](https://github.com/Indeterminado/NemmiProtoUnoPublicExperiment/tree/main/Results%20and%20Analysis "Results and Analysis") section you can find the results for the experiment we ran.
- In [Experiment resources](https://github.com/Indeterminado/NemmiProtoUnoPublicExperiment/tree/main/Experiment%20resources "Experiment resources") you can find all necessary resources to run the same experimental protocol, as well as instructions on doing so.
- In [Mobile app source code/NemiProtoUno](https://github.com/Indeterminado/NemmiProtoUnoPublicExperiment/tree/main/Mobile%20app%20source%20code/NemiProtoUno "This path skips through empty directories") you will find the Android Studio project for the app needed to run the experiment.

And that's it. Each section has a specific readme file with details regarding its contents. For anything else, feel free to shoot me a message.

# NemmiProtoUno Results and Analysis

In this section you can find all the results of the analysis we ran during our experiment, as well as the SPSS syntax file so you can run the same tests.

Alexandre Clément