commits made while instructing EECS 1720 - Building Interactive Systems (winter 2022) (course @York University, Canada)
This is the end of EECS 1720 - Building Interactive Systems! We have done so much and come so far!
A second course teaching more advanced programming concepts within the context of image, sound and interaction using an object-oriented language; introduction to interactive systems, user interfaces, event-driven programming, object design and inheritance; implementation using debuggers, integrated development environments, user interface builders.
Winter 2022: Augmented Reality Class Exhibit.
- To form by combining materials or parts; construct.
- To form by ordering and uniting materials by gradual means into a composite whole
- To develop or give form to according to a plan or process; create.
- To develop according to a systematic plan, by a definite process, or on a particular base
- involving the actions or input of a user especially : of, relating to, or being a two-way electronic communication system (such as a telephone, cable television, or a computer) that involves a user's orders (as for information or merchandise) or responses (as to a poll)
- to act upon one another
- Interactivity is the communication process that takes place between humans and computer software.
- A group of interacting, interrelated, or interdependent elements forming a complex whole.
- An organism as a whole, especially with regard to its vital processes or functions.
- A collection of elements or components that are organized for a common purpose.
- the organization or plan itself (and is similar in meaning to method, as in "I have my own little system") and sometimes describes the parts in the system (as in "computer system")
- User Interfaces (UIs), UI Elements, Guidelines for UI design
- User Interface Builders, Integrated Development Environments
- Objects, classes and inheritance
- Interactive WWW -based systems - basic network concepts, guidelines for design
- Event driven programming
- Intro to threads and/or asynchronous event handling
- Designing engaging interactive systems, games, etc.
- IFTTT concepts and bootstrapping
- basic server-client models, browser extensions, Web APIs
- real-time networking
An extension adds features and functions to a browser. It's created using familiar web-based technologies—HTML, CSS, and JavaScript. It can take advantage of the same web APIs as JavaScript on a web page, but an extension also has access to its own set of JavaScript APIs. This means that you can do a lot more in an extension than you can with code in a web page.
- The act of extending or the condition of being extended
- Enhance or complement a website
- Let people show their personality
- Add or remove content from web pages:
- Add tools and new browsing features
- Games
- Add development tools
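As a sketch of how little code an extension needs, here is a hypothetical content script (the function name and keyword are invented for illustration); in a real extension this file would be listed under `content_scripts` in the extension's `manifest.json`:

```javascript
// Hypothetical content script: highlight a keyword on every page it runs in.
// The transformation is kept as a pure, DOM-free function so it is easy to test.
function highlight(html, keyword) {
  const pattern = new RegExp(keyword, "gi");
  return html.replace(pattern, (match) => `<mark>${match}</mark>`);
}

// In the actual content script we would apply it to the live page, e.g.:
// document.body.innerHTML = highlight(document.body.innerHTML, "interactive");
```

Because a content script runs inside the page, it can read and rewrite anything the page shows, which is exactly how the "add or remove content from web pages" bullet above works in practice.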
Application programming interfaces, or APIs, simplify software development and innovation by enabling applications to exchange data and functionality easily and securely.
- An application programming interface, or API, enables companies to open up their applications’ data and functionality to external third-party developers, business partners, and internal departments within their companies.
- This allows services and products to communicate with each other and leverage each other’s data and functionality through a documented interface.
- Developers don't need to know how an API is implemented; they simply use the interface to communicate with other products and services.
- API use has surged over the past decade, to the degree that many of the most popular web applications today would not be possible without APIs.
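To make the "documented interface" idea concrete, here is a hedged JavaScript sketch: the endpoint URL and parameters are made up, and the only real APIs used are the standard `URLSearchParams` and (optionally) `fetch`:

```javascript
// Build a request URL for a hypothetical web API.
// The caller never needs to know how the service is implemented --
// only the documented endpoint and parameter names.
function buildRequestUrl(base, params) {
  // URLSearchParams handles the URL encoding details for us.
  const query = new URLSearchParams(params).toString();
  return `${base}?${query}`;
}

const url = buildRequestUrl("https://api.example.com/v1/search", {
  q: "interactive systems",
  limit: "5",
});

// In a browser (or modern Node) the actual request would be:
// fetch(url).then((res) => res.json()).then((data) => console.log(data));
```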
Python is a high-level, general-purpose programming language. Its design philosophy emphasizes code readability with the use of significant indentation. Its language constructs and object-oriented approach aim to help programmers write clear, logical code for small- and large-scale projects.
- Python is powerful... and fast;
- plays well with others;
- runs everywhere;
- is friendly & easy to learn;
- is Open
JavaScript, often abbreviated JS, is a programming language that is one of the core technologies of the World Wide Web, alongside HTML and CSS. Over 97% of websites use JavaScript on the client side for web page behavior, often incorporating third-party libraries.
JavaScript is a programming language used primarily by Web browsers to create a dynamic and interactive experience for the user. Most of the functions and applications that make the Internet indispensable to modern life are coded in some form of JavaScript!
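Event-driven programming is the pattern underneath most of that interactivity. A minimal sketch, using a made-up dispatcher class standing in for the browser's `addEventListener`/event loop:

```javascript
// Tiny event dispatcher: code registers handlers, then reacts when events fire.
class Dispatcher {
  constructor() {
    this.handlers = {}; // event name -> array of callbacks
  }
  on(event, callback) {
    if (!this.handlers[event]) this.handlers[event] = [];
    this.handlers[event].push(callback);
  }
  emit(event, payload) {
    for (const callback of this.handlers[event] || []) callback(payload);
  }
}

const clicks = new Dispatcher();
let clickCount = 0;
clicks.on("click", () => { clickCount += 1; });
clicks.emit("click");
clicks.emit("click");
// clickCount is now 2 -- the handler only ran when events fired
```

In the browser the same shape appears as `button.addEventListener("click", callback)`: the program does nothing until the user acts.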
XR is a new buzzword in the tech world, but it’s pretty simple to understand. XR is an umbrella term that rolls in VR, AR, and MR. In a nutshell, XR technology is any tech that takes your display and makes it more immersive, or makes it interact with your real-world surroundings in some way. And it truly is changing the future.
- extended reality: XR
- virtual reality: VR
- augmented reality: AR
- mixed reality: MR
For decades, we have attempted to bridge the gap between the digital and physical worlds. XR is a superset which includes the entire spectrum from the completely real to the completely virtual:
- concept of the reality–virtuality continuum
- lies in the extension of human experiences, especially relating to the senses of existence (represented by VR) and the acquisition of cognition (represented by AR)
- continuous development in human–computer interactions means this concept is still evolving...
Augmented reality is a technique that adds layers to the world as we are used to it. These layers could be visual, auditory and sensory information to intensify your experience.
Even though you can’t change the world you’re living in, augmented reality makes it possible to give your surroundings an extra dimension. By using images, sounds, texts or even GPS data, you can enrich the place you’re in. It is key that these elements are presented spatially to affect your depth perception.
The AR technique has a certain degree of power, convincing your brain that those elements really exist in your environment.
- CLO1: Conduct meta-design (design a system to design things).
- CLO2: Integrate computational techniques for content generation (or filtering) in a social media context. Differentiate and appraise the aesthetic of systems for generating public performances.
- CLO3: Differentiate and appraise the aesthetics, design, and concept of systems that support collaboration and emergent creative behaviour. Implement fundamental data structures to design an interface, balancing constraints and incentives for participation.
- CLO4: Review protocols for website structure and display. Identify and creatively experiment with Internet browser functionality and APIs.
- CLO5: Apply generative design principles to the design of 3D form. Discuss form in relation to the contexts of the body, society, or the environment.
- CLO6: Develop, design, and execute a creative intervention at a specific site.
- something a bit familiar
- processing, IDE
- similar structure but introduction of Python and JavaScript
- JSON, CLI, GUI
- systems for building ... systems !
- use of OOP in Javascript and Python
- robots! families so cute!
- interaction dynamics with p5js
- start to build systems that can run..
- locally
- hosted
- how to interact?
- permissions
- extensions
- next step in permissions? APIs
- tokens, connecting
- the developer's method of interacting between existing systems
- Building custom access to platforms that you know and that you use
- What can we do? What do you want to do? What does Digital Media ask?
- art, access, content, custom, unique, technical!
- you need to develop skills to be able to create your own ideas and manifest them!
- Public content
- adding extensions, connecting to social platforms, hosting your own content
- Cross-platform, cross-device - web-based
- open source! see how to build so that you can build!
- XR is future, content is immersive, we are just beginning to explore this
- Can't wait to see the final projects!
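The OOP recap above (classes and inheritance, used in both JavaScript and Python) can be sketched like this in JavaScript; the robot names and methods are invented for illustration:

```javascript
// Base class: shared state (name) and behaviour (greet).
class Robot {
  constructor(name) {
    this.name = name;
  }
  greet() {
    return `${this.name} beeps hello`;
  }
}

// Subclass: inherits greet() and extends the constructor with new behaviour.
class DancingRobot extends Robot {
  constructor(name, move) {
    super(name); // run the parent constructor first
    this.move = move;
  }
  dance() {
    return `${this.name} does the ${this.move}`;
  }
}

const bot = new DancingRobot("Robo", "spin");
// bot.greet() -> "Robo beeps hello"  (inherited from Robot)
// bot.dance() -> "Robo does the spin"
```

The same structure carries over almost line-for-line to Python classes with `class DancingRobot(Robot):` and `super().__init__(name)`.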
In `live_code/`:
- most recently, `AR-OTHER_week12/` contains the simple face tracking, image tracking (vs marker tracking), and soon the hand tracking
- Updates on combining more of the systems together to build differing levels of interactivity will be looked at in our last lab session tomorrow (Wednesday April 6 2022)
- Also, Phase 3 - final submission of Group Project requirements - will be detailed in that lab session
- FYI Immersive Web Working Group - cross API topics for Augmented Reality (or where decisions about stuff happen)
I have moved all code (not including today's yet) and lecture notes we have produced during lectures, quizzes, and lab projects, along with the GroupProject folder where I updated the AR code; all is now copied to the `live_code/` and `Lecture_notes_to_review/` folders in our main EECS_1720 repo.
I am going through the repo to update any files that need updating or to add any lecture notes. Also note the following last week before reading week plan.
For the week:
- Monday: Try to get Phase 1 submitted or at least let me know if there are still group member issues
- Tuesday:
  - adding processing content to our social bot example
    - this way students have the option of separating what they want to update their social bot with .. from the social bot access
    - if you want to try and go for a whole system build/update from the same program/file etc. go for it!
    - otherwise we will explore processing.py options to keep things familiar
  - (Tuesday/Thursday) review previous content for OOP classes in JavaScript (p5.js) and Python (processing.py)
- Wednesday: more social bot code
  - Should have grades back and will review before posting to eClass - expect to have the following released over the next couple of days:
    - Lab 1 - part a) and part b)
    - Lab 2 - part a)
    - Quiz 1
    - 2 checks of 100 days (so 2% of the 10%)
  - Lab will work through social bot code and design
    - by the end of lab, or within a few days, students should decide if they want to try to integrate into an alternative social feed or else use the Twitter example; we will work through adding tokens for a full working Twitter example in this case so that you can always revert back to this type of social bot
- Thursday:
  - we will set up a simple localhost server
  - (Tuesday/Thursday) review previous content for OOP classes in JavaScript (p5.js) and Python (processing.py)
  - Quiz 2 released (we will not do it in lecture - you will have the next day to complete it)
- Friday:
  - Hand in Quiz 2 by EOD ([E]nd [O]f [D]ay) .. nah, we are having too much fun with it. The Robot family is sticking around until Family Day. I swear I did not plan this.
  - Since we are having so much fun with Quiz 2 and Lab project 2 .. Phase 2 (group project information) will be released later during RW
  - Grades returned if not already done so
A reminder that all code developed and demoed during class will be updated in the folder `live_code` available in this repo. Note that it will be *live*, so you can copy-paste immediately, but the content itself might be organized or rearranged after lecture.
The code will remain in `live_code` until the next lecture - at which point it will be moved to the `Content_by_Topic` and `Content_by_Week` folders so that our `live_code` folder can be emptied and ready for the next lecture's code content.
Specifically: video files of any live code will be made available ASAP (I have to wait for Zoom to send me a 'recording is ready' notification - then I can grab, edit if needed, and share the file). Videos will be available either directly from Zoom (if there is no need to edit); otherwise they can be found in our Lecture_Videos drive and/or eClass
- In response to feedback from our questionnaire, content will now be arranged by both:
  - Topic
  - Week

  The content will be the same - just organized differently - so, depending on your preference, you can look content up by topic (python, JavaScript, lecture notes etc.) or by week (follows the course syllabus regarding content)
- I have also added some clarification in past files regarding what is for advanced students and not for those just starting. Moving forward I will indicate during lecture and in any lecture notes where the content diverges, and depending on whether we are doing a *deeper dive* or *just starting*, you will know within what context we are working and can tune out or tune in accordingly (but keep playing with the code in either case!).
- Digital Media is always a mix of people, often with large differences in kind and type of knowledge and experience
  - it is expected that you will have different skills
  - it is expected that you will learn at different paces
  - you are expected to improve as compared to yourself (and not other students!)
- Since we are now all set up with repos and have some familiarity working with code files and folders, lectures will start exploring, in much more detail, the aspects of *sound*, *image*, and *interaction* directly in *live code* examples where students are expected to follow along *actively*. This process will be worked from both processing.py (python mode in processing v3) and carried into JavaScript with p5.js. This will let us compare the code and, because of the direct connection to the processing framework, both languages will maintain very similar structure, function names, and reference points. Now we can start exploring the underlying patterns in our code's logic, output, and general form.
- We will generally always see content within a processing context, so processing.py / python mode in the PDE (recall: [P]rocessing [D]evelopment [E]nvironment) when working with python, and p5.js when working with JavaScript. When working through code in the p5.js context, any `index.html` files required for the JavaScript examples will be identical other than changing the name of the `sketch.js` file if we are using more than one. Any libraries will be provided, or direct instructions on how to load or add them will be mentioned.
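A minimal `index.html` of the kind described above might look like this (the CDN path shown is the standard p5.js CDN, but the exact version and the `sketch.js` filename are illustrative - use whatever the lab provides):

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- load the p5.js library (pinning a version is up to you) -->
    <script src="https://cdn.jsdelivr.net/npm/p5/lib/p5.min.js"></script>
    <!-- only this filename changes between examples -->
    <script src="sketch.js"></script>
  </head>
  <body>
    <main></main>
  </body>
</html>
```

Swapping `sketch.js` for another sketch file is the only edit needed between examples, which is why the HTML is described as identical across them.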
- We will at times develop (individual pace) both *python* and *JavaScript* content outside of *processing*. Some of our python exploring will be done as individual `.py` files, and more and more of our JavaScript exploring will be extended beyond p5.js.
Files and folders are updated.
- `docs/` contains
  - our `.md` files that you should probably (definitely) look at
- `JavaScript/` contains
  - three working examples of where to `JavaScript`-in-`HTML`
- `processing/` contains
  - `p5js/`
    - stand-alone local browser example
  - `python-mode/`
    - python version of supplied p5js example
- `misc/` contains
  - some additional `txt` files for installation help (incomplete for all `OS` but will be updated when and if we need to)
  - an `img` folder to collect found content (don't worry, if it's important it won't stay in the misc folder)
  - weekly updates otherwise not found in `docs/` (where those `.md` files that you should probably look at reside)
- `demos/` contains
  - extension sample code (Lab 1)
    - with `HTML`
    - with `HTML` and `JavaScript`
- live content will be cleaned, edited, and described in logfile and code comments each week
- don't worry about the GitHub CLI (command line interface) files .. we will look at git and/or gh during Week 2
  - it's there if you're interested though
course syllabus with updated due dates is now available
browser extension demo is in .. `extension-demo/`
- Be sure to check additional info in `extension-debugINFO.md`
- recall the different ways to -> chrome://extensions or about:addons
- a more in-depth version will be in .. `more-extension-demo/`

python mode for processing is in `PYTHON/`
- you should have already installed processing from 1710 - if not, instructions are in the folder
- for now we will run and edit python sketches from within the Processing Development Environment (PDE)

p5js for processing is in `P5JS/`
- this is a bit more work to set up as we will run our own http-servers
- for now you can copy the sketch code into the p5.js editor on the website and start playing around
- if you really want to get into the self-serve .. there are some instructions included in the folder
This README file is formatted with Markdown :)
(but web references are often only added to our `.md` files found in `docs/`)
- (yes, the `.md` files you should probably look at)
https://github.github.com/gfm/
https://www.theserverside.com/answer/Git-fork-vs-clone-Whats-the-difference