# smokie

give papa some data // computer vision applied to the world (for science and fun)

definitely not skynet. definitely.

## approach // poc

seattle/oregon/cali wildfire shit

- create crawlers for one stream of timestamped images of "open views" (rough sketch below)
- save and compress images with metadata
- correlate gps and surveillance
- run the labeled dataset through a basic cnn model (and then the full vizinet model)
- 🕳️
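
a minimal sketch of what the first couple of steps (crawl one feed, recompress, keep gps + timestamp alongside) could look like. the feed url, coordinates, and cadence below are placeholders, not a real source:

```python
# hypothetical single-feed crawler: grab a frame, recompress it, write a json sidecar.
# CAM_URL / LAT / LON are made-up placeholders for whatever "open view" gets scraped.
import json
import time
from datetime import datetime, timezone
from io import BytesIO
from pathlib import Path

import requests
from PIL import Image  # pip install requests pillow

CAM_URL = "https://example.com/webcam/latest.jpg"   # placeholder feed
LAT, LON = 47.6062, -122.3321                        # seattle-ish, for wildfire smoke
OUT_DIR = Path("data/seattle_cam")
OUT_DIR.mkdir(parents=True, exist_ok=True)

def grab_frame():
    """fetch one frame, recompress to jpeg, and write gps/timestamp metadata next to it."""
    ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    resp = requests.get(CAM_URL, timeout=30)
    resp.raise_for_status()

    img_path = OUT_DIR / f"{ts}.jpg"
    Image.open(BytesIO(resp.content)).convert("RGB").save(img_path, "JPEG", quality=70)

    meta = {"timestamp": ts, "lat": LAT, "lon": LON, "source": CAM_URL}
    img_path.with_suffix(".json").write_text(json.dumps(meta))

if __name__ == "__main__":
    while True:        # one frame every 10 minutes
        grab_frame()
        time.sleep(600)
```

a json sidecar per frame keeps gps/timestamp greppable without touching exif, which makes the "correlate gps and surveillance" step a plain join on filenames later.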

high-level research/motivation below. dunno if we'll get there

## why?

🇫 🇺 🇳

### track humans

- social distancing compliance during covid-19

### applying computer vision where it doesn't belong (maybe)

formulating earth's air quality estimation purely as a computer vision problem (a baseline sketch follows the list below). sources of pm2.5 (particulate matter ≤ 2.5 µm):

- wildfires
- factories
- power plants
- cars
- cow burps and farts (yum)
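
before any cnn, a classical baseline makes the "air quality from pixels" framing concrete: the dark channel prior (he et al., 2009) observes that haze-free outdoor scenes have near-black minima in small patches, so a brighter dark channel roughly tracks how much particulate is scattering light into the camera. a rough sketch, not smokie's actual model:

```python
# crude per-image haze score via the dark channel prior — a baseline, not the real model.
import numpy as np
from PIL import Image
from scipy.ndimage import minimum_filter  # pip install numpy pillow scipy

def haze_score(path: str, patch: int = 15) -> float:
    """return a rough 0..1 score; higher ~ more haze / particulate in the scene."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    dark = minimum_filter(img.min(axis=2), size=patch)        # dark channel
    airlight = np.quantile(img.max(axis=2), 0.999)            # crude atmospheric light
    transmission = np.clip(1.0 - dark / max(airlight, 1e-6), 0.0, 1.0)
    return float(1.0 - transmission.mean())                   # hazier scene -> higher score
```

scoring every crawled frame this way gives cheap weak labels, which helps before anyone hand-labels a dataset for the cnn.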

the main benefit is for developing countries, where camera-based estimates could stand in for expensive monitoring equipment doing the same job. the proof point that matters is accuracy: once the vision estimate is accurate enough that the tradeoff is clear, citizens' cameras become sensors, and the metrics they produce can improve decision making for that same populace.

hypothesis: understanding the exponential attenuation caused by particulate matter could help depth estimation systems and autonomous vehicles driving in adverse weather (not just on earth). there's potential for transfer learning here, but i could be wrong.
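
spelled out, that attenuation is the beer-lambert / koschmieder model: observed radiance decays like exp(-β·depth), and β grows with particulate concentration. so an estimate of per-pixel transmission plus a known β gives depth, and known depth plus measured transmission gives β (i.e. an air-quality signal). a tiny illustrative sketch:

```python
# beer-lambert / koschmieder relation: t = exp(-beta * depth).
# beta (the scattering coefficient) is exactly the quantity pm2.5 drives upward.
import numpy as np

def depth_from_transmission(t, beta):
    """invert t = exp(-beta * d) to recover per-pixel depth (units of 1/beta)."""
    return -np.log(np.clip(t, 1e-6, 1.0)) / beta

def transmission_from_depth(d, beta):
    """forward model: fraction of scene radiance that survives the haze over distance d."""
    return np.exp(-beta * d)
```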

## data sources

ideal collection locations: seattle (today, for wildfires), beijing, india

### realtime

### static

## how?

## related

### idk

### indices

### crawlers & surveillance

### air quality

### wildfire early detection