%% Commands for TeXCount
%TC:macro \cite [option:text,text]
%TC:macro \citep [option:text,text]
%TC:macro \citet [option:text,text]
%TC:envir table 0 1
%TC:envir table* 0 1
%TC:envir tabular [ignore] word
%TC:envir displaymath 0 word
%TC:envir math 0 word
%TC:envir comment 0 0
\documentclass[sigconf]{acmart}
\citestyle{acmauthoryear}
%%
%% \BibTeX command to typeset BibTeX logo in the docs
\AtBeginDocument{%
\providecommand\BibTeX{{%
Bib\TeX}}}
\setcopyright{rightsretained}
\copyrightyear{2025}
\acmYear{2025}
\acmConference{SIGGRAPH Appy Hour '25}{August 10-14, 2025}{Vancouver, BC, Canada}
\acmBooktitle{Special Interest Group on Computer Graphics and Interactive Techniques Conference Appy Hour (SIGGRAPH Appy Hour '25), August 10-14, 2025}
\acmDOI{10.1145/3721260.3733980}
\acmISBN{979-8-4007-1552-5/2025/08}
\begin{document}
%%
%% The "title" command has an optional parameter,
%% allowing the author to define a "short title" to be used in page headers.
\title{ScavengeAR: From Licensing Fees to Free — Rebuilding Mobile AR with Unity-Native Tools}
\author{Victor Leung}
%%\email{thevictor2225@gmail.com}
\orcid{0009-0000-0600-668X}
%%\affiliation{%
%% \institution{Independent Developer}
%% \city{California}
%% \state{Mountain View}
%% \country{USA}
\begin{abstract}
ScavengeAR, a conference-scale augmented reality creature-collecting mobile game, returns to SIGGRAPH in 2025 after a six-year hiatus, featuring a modernized tech stack and refined gameplay. The reboot preserves the core player experience while eliminating costly third-party dependencies.
\end{abstract}
\begin{teaserfigure}
\includegraphics[width=\textwidth]{scavengear_teaserbanner.jpg}
\caption{Testing ScavengeAR's 2017 mobile build in Unity}
\Description{A person holds a smartphone displaying an augmented reality (AR) creature over a colorful marker pattern. Behind the phone, a computer screen shows the same marker being processed in Unity with a grid of image target icons. The AR creature is a teal, cartoon-style robot with yellow antlers and big eyes, floating above text describing SIGGRAPH’s AR showcase.}
\label{fig:teaser}
\end{teaserfigure}
%%\received{20 February 2007}
%%\received[revised]{12 March 2009}
%%\received[accepted]{5 June 2009}
\maketitle
\section{Introduction}
ScavengeAR was SIGGRAPH's official AR mobile app from 2017 to 2019, offering attendees an engaging augmented reality experience and providing students at the Luddy School of Informatics, Computing, and Engineering, Indiana University Indianapolis with hands-on opportunities in interactive media development.
In the game, SIGGRAPH attendees choose a role — Artist, Scientist, or Educator — and explore the conference venue to discover printed image targets that, when scanned through the app, spawn 3D AR creatures into their physical surroundings. Like a digital safari, players use the in-app camera to photograph each creature, capturing them into a personal library. New creatures are released each day of the conference, encouraging daily exploration and unlocking new layers of narrative and prize opportunities over time. In addition, the attendee has their own creature based on their role, which encourages real interactions with other attendees in order to catch all the creatures for the best prizes.
\begin{figure}[ht]
\centering
\includegraphics[width=\linewidth]{scavengear_threephoneview.PNG}
\caption{Los Angeles Convention Center map screen, profile screen, and role share screen.}
\label{fig:screens}
\end{figure}
\section{Motivation}
ScavengeAR ran from 2017 to 2019, with over a thousand daily active users during each conference. In 2020, the COVID-19 pandemic stopped in-person activity for several years and effectively halted the app's development, leaving it in a deprecated state.
In 2025, a small volunteer team undertook a comprehensive refactoring of ScavengeAR to improve its maintainability and accessibility, and to revive the spirit of the app for a modern audience.
\section{Technical Approach}
The original version of ScavengeAR was built using Unity 2018 and Vuforia 8, a robust and accessible image-tracking solution that was among the most practical options for marker-based 2D AR at the time. Vuforia's artist-friendly pipeline used a parent-child GameObject hierarchy within a single-scene architecture, allowing low-code AR development and rapid prototyping in the era before ARKit/ARCore standardization. However, Vuforia also required licensing fees for production deployment and cloud-hosted image-target storage.
For the 2025 relaunch, ScavengeAR was refactored to use Unity 2022 and AR Foundation 5, Unity's native, cross-platform AR framework built on top of ARKit and ARCore, to support lightweight, offline 2D image tracking and to reduce long-term costs, build size, and technical dependencies.
Migrating to AR Foundation required an architectural shift from Vuforia's monolithic, parent-child single-scene model to AR Foundation's modular, multi-scene, prefab-driven design. In exchange for Vuforia's simplicity, the new system offered improved performance management and a clearer separation of concerns. AR Foundation also supports full in-editor XR simulation, replacing webcam-based editor workflows, along with more performant URP integration in place of the Built-in Render Pipeline. These capabilities proved essential for rapidly iterating on the modernized photo capture experience.
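The tracked-image workflow described above can be sketched as follows. This is a minimal illustration of AR Foundation's image-tracking API, not the app's actual code; the component layout, prefab lookup, and naming convention are assumptions for the example.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: spawn a creature prefab when a printed marker is recognized.
// Assumes an ARTrackedImageManager configured with a reference image
// library whose entry names match prefab names (hypothetical convention).
public class CreatureSpawner : MonoBehaviour
{
    [SerializeField] private ARTrackedImageManager trackedImageManager;
    [SerializeField] private List<GameObject> creaturePrefabs;

    private readonly Dictionary<string, GameObject> spawned = new();

    private void OnEnable()  => trackedImageManager.trackedImagesChanged += OnChanged;
    private void OnDisable() => trackedImageManager.trackedImagesChanged -= OnChanged;

    private void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
        {
            // Parent the creature to the tracked image so it follows the marker.
            var prefab = creaturePrefabs.Find(p => p.name == image.referenceImage.name);
            if (prefab != null)
                spawned[image.referenceImage.name] = Instantiate(prefab, image.transform);
        }
        foreach (var image in args.updated)
        {
            // Hide creatures whose markers are no longer actively tracked.
            if (spawned.TryGetValue(image.referenceImage.name, out var creature))
                creature.SetActive(image.trackingState == TrackingState.Tracking);
        }
    }
}
```

Because AR Foundation's XR Simulation can drive `ARTrackedImageManager` inside the editor, a script like this can be iterated on without deploying to a device.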
The app was originally built using GameSparks for backend services and a licensed data-binding plugin for UI management; we replaced GameSparks with Unity's built-in serialization for offline data handling and transitioned to an open-source MVVM Toolkit. These changes not only further reduced external dependencies and licensing costs but also simplified the codebase, making it more approachable for new contributors.
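As a rough illustration of the offline serialization change, player progress can be persisted locally with Unity's built-in `JsonUtility` rather than a cloud backend. The data fields and file name here are hypothetical, not the app's actual schema.

```csharp
using System.IO;
using UnityEngine;

// Sketch: local save data via Unity's built-in serializer, replacing a
// cloud-hosted backend. Field names below are illustrative assumptions.
[System.Serializable]
public class PlayerData
{
    public string role;                // Artist, Scientist, or Educator
    public string[] capturedCreatures; // IDs of creatures the player photographed
}

public static class SaveSystem
{
    private static string SavePath =>
        Path.Combine(Application.persistentDataPath, "player.json");

    public static void Save(PlayerData data) =>
        File.WriteAllText(SavePath, JsonUtility.ToJson(data));

    public static PlayerData Load() =>
        File.Exists(SavePath)
            ? JsonUtility.FromJson<PlayerData>(File.ReadAllText(SavePath))
            : new PlayerData();
}
```

Keeping all state on-device removes both the server dependency and the per-event licensing cost, at the price of losing cross-device sync.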
While the core functionality remains consistent with previous iterations, this overhaul ensures that ScavengeAR can continue to be a sustainable and educational project. By leveraging Unity's modern XR capabilities and open-source tools, we've positioned the app for easier updates and potential future enhancements, aligning with our goal of providing valuable learning experiences for students and enjoyable interactions for conference attendees.
\section{Art and Design}
\begin{figure}[ht]
\centering
\includegraphics[width=\linewidth]{scavengear_newcreatures.jpg}
\caption{New creatures for future ScavengeAR}
\label{fig:newcreatures}
\end{figure}
Luddy School of Informatics, Computing, and Engineering, Indiana University Indianapolis students have historically contributed original 3D creature designs for each iteration of ScavengeAR, and they return once again in 2025 with a new cast of characters. This year's creatures were developed using a traditional animation and visual effects pipeline, continuing the project's tradition of experimenting with new styles and creative direction in each version.
For the user experience, we retained the core interaction flow from ScavengeAR 2019, which has been iteratively tested and refined through multiple conference deployments. While earlier versions of the app featured playful mini-games like Triviatron and Sigglet Falls, these proved non-essential to the primary gameplay loop. Instead, our focus shifted to improving the AR photo capture experience: updating the camera overlay to better suit modern tall-screen phones, adding a tactile shutter animation, and introducing a polaroid-style fadeout to reinforce the analog camera motif.
\section{Future Work}
\begin{figure}[ht]
\centering
\includegraphics[width=\linewidth]{scavengear_2018creature.png}
\caption{A 2018 ScavengeAR creature, originally featured at the SIGGRAPH Studio venue in the Vancouver Convention Centre, shown here without the mobile user interface.}
\label{fig:creature2018}
\end{figure}
Mobile AR has historically offered a more accurate representation of augmented reality by leveraging the raw camera feed to detect physical markers in the environment. At the time, mixed reality headsets restricted access to raw passthrough video, limiting similar capabilities. That trend is now shifting, with newer headsets offering improved access to video passthrough.
As we look ahead, a key challenge will be unifying the user experience (UX), user interface (UI), and interaction patterns across both mobile and headset-based platforms. With our latest refactor, we’re well-positioned to support both moving forward.
\begin{acks}
ScavengeAR could not have been created without the significant contributions of: Casey Kwock, Victor Leung, Thach Nguyen, Louie Whitesel, and Zeb Wood. Additionally, we appreciate the supporting contributions of Jose Garrido, Spencer Hayes, Alli Johnson, and Andy Wang.
We also thank the many volunteers who helped bring previous versions of the game to life, as well as the donors whose generosity helped cover software licensing costs throughout the project’s development.
Finally, we extend our gratitude to the players across SIGGRAPH conferences whose enthusiasm and feedback have continually shaped the experience.
\end{acks}
%%
%% The next two lines define the bibliography style to be used, and
%% the bibliography file.
%%\bibliographystyle{ACM-Reference-Format}
%%\bibliography{sample-base}
%%
%% If your work has an appendix, this is the place to put it.
\appendix
\end{document}
\endinput