Chenxuan Sun · cw3902@nyu.edu · chenxuan_ss

A



Introduction


Live-Diffusion is an interactive XR system that explores how sensor data can shape AI-powered creative expression. Rather than staying broad, we focus on connecting AI and digital interfaces with the real-time environment. By integrating temperature and humidity sensors, the system lets AI image generation respond directly to surrounding conditions, making the outputs more dynamic and context-aware. Built with TouchDesigner and ComfyUI, the interaction is designed to be intuitive and accessible, encouraging creators to experience how environmental data can actively influence generative visuals.

Live-Diffusion is an interactive XR system that explores how environmental factors influence AI generation. We use temperature and humidity sensors so that the surrounding environment shapes the image-generation input in real time; the resulting visuals respond not only to the user but also to the environment. Combining TouchDesigner and ComfyUI, we provide a clear, intuitive interface that lets creators participate without a steep technical barrier. Our goal is to show how AI image generation can connect closely with the real world, creating a more immersive and inclusive creative experience.
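As a rough illustration of the sensor-to-image mapping described above, the sketch below normalizes temperature and humidity readings and turns them into diffusion parameters. All names, ranges, and keyword choices are hypothetical; in the actual installation this mapping runs inside TouchDesigner, which drives ComfyUI.

```python
def climate_to_prompt(temp_c: float, humidity_pct: float) -> dict:
    """Map raw sensor readings to diffusion parameters.

    Illustrative only: the ranges, keywords, and parameter names here
    are assumptions, not the installation's actual mapping.
    """
    # Normalize to 0..1 over plausible indoor/outdoor ranges
    t = min(max(temp_c / 40.0, 0.0), 1.0)
    h = min(max(humidity_pct / 100.0, 0.0), 1.0)

    # Warmer air -> warmer palette keywords; more humid -> hazier look
    palette = "warm golden light" if t > 0.5 else "cool blue tones"
    texture = "soft misty haze" if h > 0.6 else "crisp sharp detail"

    return {
        # Text appended to the user's base prompt
        "prompt_suffix": f"{palette}, {texture}",
        # Humidity gently raises denoise strength so the image drifts more
        "denoise": round(0.4 + 0.3 * h, 2),
    }
```

A call like `climate_to_prompt(28.0, 75.0)` would bias the output toward a warm, hazy image with a higher denoise strength; the same base prompt rendered on a cold, dry day would drift toward cooler, sharper results.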



B



NYU GLOBAL SHOW N TELL


A collaborative initiative with NYU's global campuses and the ITP Graduate School, Live-Diffusion was deployed across five sites in New York, Abu Dhabi, and Shanghai. The project distinguished itself by enabling real-time AI reinterpretation of globally streamed camera feeds, earning it a place in NYU's AI education exhibition and the Goddard Global Impact Award.

In collaboration with NYU's global campuses and the ITP Graduate School, Live-Diffusion was exhibited simultaneously at five sites spanning three time zones. The system achieved real-time access to camera feeds from around the world with dynamic AI reinterpretation, was invited to exhibit as part of NYU's AI education showcase, and received the Goddard Global Impact Award.


C



LIVE-DIFFUSION in HARVARD XR


Live-Diffusion was invited to exhibit at Harvard XR 2025, hosted by the Harvard Graduate School of Design (GSD), and presented its gesture-interactive version at George Gund Hall in Boston.

The project was invited to the Harvard XR 2025 summit, hosted by the Harvard Graduate School of Design (GSD), where it presented its gesture-interactive version at George Gund Hall in Boston.

https://m.youtube.com/playlist?list=PLBlfE2YKQnrhoQFCvviV7ACbKLeFFiItd&noapp=1


D



冲! Weekend Art Camp

Participated in a pop-up game event at Xinhua Community to collect user experience feedback and observe player interactions in a real-world setting.



E



NYUSH 2024 IMA SHOW

This updated version of the piece incorporates real-time environmental data—temperature and humidity—captured by sensors, alongside an added sound component. These enhancements aim to make the work more interactive and globally connected. By using climate data and user-generated sound, the piece responds to local conditions from different parts of the world. Temperature and humidity are universally relatable elements—like the common question, “What’s the weather like where you are?”—fostering a shared experience across distant locations.

The new version adds environmental data that a camera cannot capture, temperature and humidity, gathered by sensors, along with a sound element. "What's the weather like today?" is a question people often ask; temperature and humidity are everyday touchpoints for conversation and feeling. By responding to local climate conditions around the world, the piece creates a shared experience that spans distance.
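The sound component can be sketched in the same spirit. The mapping below, with entirely hypothetical parameter names and ranges, ties a site's temperature to a base pitch and its humidity to reverb wetness, so each location contributes a recognizably different sonic character; the installation's actual audio routing in TouchDesigner is not described in this document.

```python
def climate_to_sound(temp_c: float, humidity_pct: float) -> dict:
    """Map climate readings to sound parameters.

    Illustrative only: the pitch range and reverb mapping are
    assumptions made for this sketch.
    """
    # Normalize to 0..1 over plausible ranges
    t = min(max(temp_c / 40.0, 0.0), 1.0)
    h = min(max(humidity_pct / 100.0, 0.0), 1.0)

    return {
        # Warmer sites get a higher base pitch, spanning A3 (220 Hz)
        # to A5 (880 Hz) over two octaves
        "base_freq_hz": round(220.0 * 2 ** (2 * t), 1),
        # More humid sites sound "wetter": humidity maps to reverb mix
        "reverb_mix": round(h, 2),
    }
```

Under this mapping, a hot, humid site like Abu Dhabi in summer would produce a high, washed-out tone, while a cold, dry winter day in New York would sound low and dry, making the climate difference between the exhibition sites audible.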