AI & Agents: Intelligent Environments

Spatial interactions between humans and autonomous machines in the "smart" cities of tomorrow.

Design Brief

"As commercial interests and law enforcement agencies begin to implement urban infrastructures imbued with the capacity to remember, correlate, and anticipate, we find ourselves on the cusp of a near-future city capable of reflexively monitoring its environment and our behavior within it, becoming an active agent in the organization of everyday life. To the extent that these technologies (and how we use them) influence how we experience the city and the choices we make there..." — Sentient City: Ubiquitous Computing, Architecture, and the Future of Urban Space

Use Unity as a collaborative design tool to conceptualize, build, train, and craft AI agents to explore the future of "smart" environments. These prototyped experiments will form the basis of a new user experience proposition and enable collaboration with AI in real time.

Approach

We used a strategic prototyping process of "thinking through making" as a form of design research. Placing technology at the center (rather than the human), we worked with machine learning and agentive software in Unity to learn about artificial intelligence and the process of ML authorship.
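
As a starting point for that authorship process, a minimal sketch of an agent scripted with the Unity ML-Agents package is shown below. The goal object, observations, and reward shaping are illustrative assumptions, not the project's actual training setup.

```csharp
using Unity.MLAgents;
using Unity.MLAgents.Actuators;
using Unity.MLAgents.Sensors;
using UnityEngine;

// Illustrative ML-Agents sketch: an agent that learns to reach a goal point.
// The goal transform, observations, and reward values are assumptions.
public class PedestrianAgent : Agent
{
    public Transform goal;          // hypothetical target the agent walks toward
    public float moveSpeed = 1.5f;  // assumed walking pace in metres per second

    public override void OnEpisodeBegin()
    {
        // Reset the agent to the origin at the start of each training episode.
        transform.localPosition = Vector3.zero;
    }

    public override void CollectObservations(VectorSensor sensor)
    {
        // Observe the agent's own position and the goal position (6 floats).
        sensor.AddObservation(transform.localPosition);
        sensor.AddObservation(goal.localPosition);
    }

    public override void OnActionReceived(ActionBuffers actions)
    {
        // Two continuous actions: movement on the x and z axes.
        var move = new Vector3(actions.ContinuousActions[0], 0f, actions.ContinuousActions[1]);
        transform.localPosition += move * moveSpeed * Time.deltaTime;

        // Reward reaching the goal; a small per-step penalty encourages directness.
        if (Vector3.Distance(transform.localPosition, goal.localPosition) < 0.5f)
        {
            AddReward(1.0f);
            EndEpisode();
        }
        else
        {
            AddReward(-0.001f);
        }
    }
}
```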

Walk Quote

Navigation Agent

By experimenting with "Nav-Agent" in Unity, we imagined how mobility might be more seamless in hyperconnected environments.

We generated more organic pedestrian flows at intersections and strove to keep crossings as accessible as possible.
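
As a rough illustration of how those flows can be prototyped, the sketch below drives a pedestrian with Unity's built-in NavMeshAgent toward crossing points. The crossing targets and re-targeting logic are assumptions for illustration, not the project's exact setup.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Illustrative sketch: a pedestrian that walks between assumed crossing points
// using Unity's built-in navigation system (NavMeshAgent).
[RequireComponent(typeof(NavMeshAgent))]
public class PedestrianNavigator : MonoBehaviour
{
    public Transform[] crossingPoints;  // hypothetical crossing targets placed in the scene
    private NavMeshAgent agent;

    void Start()
    {
        agent = GetComponent<NavMeshAgent>();
        PickNextCrossing();
    }

    void Update()
    {
        // When the current destination is reached, wander to another crossing,
        // which produces more organic, less lock-step flows at intersections.
        if (!agent.pathPending && agent.remainingDistance <= agent.stoppingDistance)
        {
            PickNextCrossing();
        }
    }

    void PickNextCrossing()
    {
        if (crossingPoints.Length == 0) return;
        var target = crossingPoints[Random.Range(0, crossingPoints.Length)];
        agent.SetDestination(target.position);
    }
}
```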

We asked ourselves: how might we craft spatial interactions between humans and autonomous machines? The question spans agentive goals, data collection, learning curricula, user experiences lasting from one second to ten years, new inputs (chat, voice, face), sensor ranges, and increasingly invisible affordances.

For the agents to understand and predict pedestrians' movement, we focused on points of pedestrian attention and focus levels.

We explored ways to effectively communicate with pedestrians, and to vie for their attention within hypercities. Raycasting allowed us to simulate eye movement and attention.
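
A minimal sketch of that raycast-based gaze simulation is below. The eye transform, gaze range, and the "Signage" tag used to score attention are assumptions made for illustration.

```csharp
using UnityEngine;

// Illustrative sketch: simulate a pedestrian's gaze with a raycast and track
// a simple "focus level" for whatever the gaze lands on.
public class GazeSimulator : MonoBehaviour
{
    public Transform eye;          // hypothetical eye/head transform on the pedestrian
    public float gazeRange = 25f;  // assumed maximum attention distance in metres
    public float focusLevel;       // builds up while the gaze stays on a point of interest

    void Update()
    {
        // Cast a ray forward from the eye to find what the pedestrian is looking at.
        if (Physics.Raycast(eye.position, eye.forward, out RaycastHit hit, gazeRange))
        {
            // "Signage" is an assumed tag for objects competing for attention.
            if (hit.collider.CompareTag("Signage"))
            {
                focusLevel += Time.deltaTime;  // sustained gaze raises focus
            }
            else
            {
                focusLevel = Mathf.Max(0f, focusLevel - Time.deltaTime);
            }
        }
        else
        {
            focusLevel = Mathf.Max(0f, focusLevel - Time.deltaTime);
        }
    }
}
```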

System Diagram - Long Term

System Diagram

Archetypes and Context Types

We separated users into two segments (deterministic and non-deterministic) based on their habits. Time became a critical differentiator, while audio and visual cues became signifiers for both use cases. Similarly, we differentiated between densely populated and sparsely populated environments, where users would shift between prioritizing jaywalking and prioritizing wayfinding.
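
A toy sketch of how these archetypes and context types might be encoded in the simulation is shown below. The enum names and the rule used to classify behavior are hypothetical.

```csharp
using UnityEngine;

// Hypothetical encoding of the user archetypes and context types described above.
public enum UserArchetype { Deterministic, NonDeterministic }
public enum ContextType { DenselyPopulated, SparselyPopulated }

public class PedestrianProfile : MonoBehaviour
{
    public UserArchetype archetype;
    public ContextType context;

    // Assumed rule of thumb: a habitual (deterministic) user in a sparse context
    // tends toward jaywalking; otherwise wayfinding guidance takes priority.
    public bool PrefersJaywalking()
    {
        return archetype == UserArchetype.Deterministic
            && context == ContextType.SparselyPopulated;
    }
}
```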

Walker Runner

Results

We designed a multi-level system that addresses pedestrian flow in two parts. At the city level, the system works to facilitate crowd movement; at the individual level, it helps pedestrians move more freely throughout the city.
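
One way to express that split in code is sketched below; the interface names and responsibilities are assumptions used to illustrate the city-level/individual-level division.

```csharp
using UnityEngine;

// Hypothetical sketch of the two-part system: a city-level layer that shapes
// crowd movement and an individual-level layer that advises a single pedestrian.
public interface ICityFlowPlanner
{
    // Suggest a crossing point that balances load across the intersection.
    Vector3 SuggestCrossing(Vector3 pedestrianPosition);
}

public interface IPersonalGuide
{
    // Tell an individual pedestrian whether it is currently safe to cross.
    bool IsSafeToCross(Vector3 crossingPoint);
}
```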

Mobile Application(s)

The platform might issue notifications, letting users know when it is safe to cross a thoroughfare. Likewise, it might leverage existing AR technology to extend what popular map applications can do.
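
A hedged sketch of how such a crossing notification might be triggered is shown below; the signal-state flag and the notification call are hypothetical placeholders rather than a real mobile platform API.

```csharp
using UnityEngine;

// Hypothetical sketch: poll an assumed crossing-signal state and notify the
// user when it becomes safe to cross. The signal source and the notification
// hook are placeholders, not a real platform API.
public class CrossingNotifier : MonoBehaviour
{
    public bool signalAllowsCrossing;  // assumed to be fed by the city-level system
    private bool notified;

    void Update()
    {
        if (signalAllowsCrossing && !notified)
        {
            notified = true;
            Debug.Log("Safe to cross now.");  // stand-in for a push notification
        }
        else if (!signalAllowsCrossing)
        {
            notified = false;  // re-arm for the next crossing window
        }
    }
}
```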

TikTok Screen (Part 1)

TikTok Screen (Part 2)