%% Commands for TeXCount
%TC:macro \cite [option:text,text]
%TC:macro \citep [option:text,text]
%TC:macro \citet [option:text,text]
%TC:envir table 0 1
%TC:envir table* 0 1
%TC:envir tabular [ignore] word
%TC:envir displaymath 0 word
%TC:envir math 0 word
%TC:envir comment 0 0
\documentclass[sigconf]{acmart}
%%
%% \BibTeX command to typeset BibTeX logo in the docs
\AtBeginDocument{%
\providecommand\BibTeX{{%
Bib\TeX}}}
\setcopyright{rightsretained}
\copyrightyear{2025}
\acmYear{2025}
\acmConference{SIGGRAPH Appy Hour '25}{August 10--14, 2025}{Vancouver, BC, Canada}
\acmBooktitle{Special Interest Group on Computer Graphics and Interactive Techniques Conference Appy Hour (SIGGRAPH Appy Hour '25), August 10--14, 2025, Vancouver, BC, Canada}
\acmDOI{10.1145/3721260.3733980}
\acmISBN{979-8-4007-1552-5/2025/08}
\begin{document}
%%
%% The "title" command has an optional parameter,
%% allowing the author to define a "short title" to be used in page headers.
\title{ScavengeAR: From Licensing Fees to Free — Rebuilding Mobile AR with Unity-Native Tools}
\author{Victor Leung}
%%\email{thevictor2225@gmail.com}
\orcid{0009-0000-0600-668X}
\affiliation{%
  \institution{Independent Developer}
  \city{Mountain View}
  \state{California}
  \country{USA}}
\begin{abstract}
ScavengeAR, a conference-scale AR creature-collecting game first launched at SIGGRAPH in 2017, returns in 2025 after a six-year hiatus with a modernized tech stack and refined gameplay, preserving the core player experience while eliminating costly third-party dependencies.
\end{abstract}
\begin{teaserfigure}
\includegraphics[width=\textwidth]{scavengearbanner}
\caption{Testing an early mobile build of SIGGRAPH ScavengeAR 2017}
\Description{A person holds a smartphone displaying an augmented reality (AR) creature over a colorful marker pattern. Behind the phone, a computer screen shows the same marker being processed in Unity with a grid of image target icons. The AR creature is a teal, cartoon-style robot with yellow antlers and big eyes, floating above text describing SIGGRAPH’s AR showcase.}
\label{fig:teaser}
\end{teaserfigure}
\maketitle
\section{Introduction}
ScavengeAR was SIGGRAPH's official mobile AR app from 2017 to 2019, offering attendees an engaging augmented reality experience and providing Indiana University students with hands-on opportunities in interactive media development.
In the game, SIGGRAPH attendees choose a role — Artist, Scientist, or Educator — and explore the conference venue to discover printed image targets that, when scanned through the app, spawn 3D AR creatures into their physical surroundings. Like a digital safari, players use the in-app camera to photograph each creature, capturing it into a personal library. New creatures are released each day of the conference, encouraging daily exploration and unlocking new layers of narrative and prize opportunities over time. In addition, each attendee has their own creature based on their role, encouraging real-world interaction with other attendees in order to catch every creature and earn the best prizes.
\begin{figure}[ht]
\centering
\includegraphics[width=\linewidth]{scavengeAR_verticalphone1.png}
\caption{ScavengeAR 2019 title screen and profile view}
\label{fig:title}
\end{figure}
\section{Motivation}
ScavengeAR ran from 2017 to 2019, drawing over a thousand daily active users during the conference. In 2020, the COVID-19 pandemic halted in-person activity for several years, effectively stopping the app's development and leaving it in a deprecated state.
In 2025, a small volunteer team undertook a comprehensive refactoring of ScavengeAR to improve its maintainability and accessibility, reviving the spirit of the app for a modern audience.
\section{Technical Approach}
The original version of ScavengeAR was built using Unity 2018 and Vuforia 8, a robust and accessible image-tracking solution that was among the most practical options for marker-based 2D AR at the time. Vuforia's artist-friendly pipeline utilized a parent-child GameObject hierarchy within a single-scene architecture, allowing low-code AR development and rapid prototyping in the era before ARKit/ARCore standardization. However, Vuforia also required licensing fees for production deployment and cloud-hosted image target storage.
In addition, Vuforia’s roadmap has shifted towards 3D object and model tracking.
For the 2025 relaunch, ScavengeAR was refactored using Unity 2022 and AR Foundation 5 — Unity's native, cross-platform AR framework built on top of ARKit and ARCore — to support lightweight, offline 2D image tracking and to reduce long-term costs, build size, and technical dependencies.
Migrating to AR Foundation required an architectural shift — from Vuforia's monolithic, parent-child single-scene model to AR Foundation's modular, multi-scene, prefab-driven design. In exchange for Vuforia's ease of use, the new system offers better performance management and a clearer separation of concerns. AR Foundation also supports full in-editor XR simulation (replacing webcam-based editor workflows) and more performant rendering through the Universal Render Pipeline (URP) in place of the Built-in Render Pipeline. This was crucial for iterating quickly on the modernized photo-capture gameplay experience.
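As a concrete illustration of this marker-driven flow, the sketch below shows how 2D image tracking can be wired up with AR Foundation's \verb|ARTrackedImageManager|. This is a minimal C\# sketch under our own naming assumptions; the \verb|CreatureSpawner| class and its prefab list are hypothetical stand-ins, not ScavengeAR's actual code.
\begin{verbatim}
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: spawn a creature prefab when a printed
// marker is recognized. Prefabs are matched to reference images
// by name; this mapping is illustrative, not the shipped code.
public class CreatureSpawner : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager trackedImageManager;
    [SerializeField] List<GameObject> creaturePrefabs;

    readonly Dictionary<string, GameObject> spawned = new();

    void OnEnable()  => trackedImageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => trackedImageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
        {
            string key = image.referenceImage.name;
            var prefab = creaturePrefabs.Find(p => p.name == key);
            if (prefab != null && !spawned.ContainsKey(key))
                // Parenting to the tracked image keeps the creature
                // anchored to the physical marker as it moves.
                spawned[key] = Instantiate(prefab, image.transform);
        }
    }
}
\end{verbatim}
Because each creature lives on its own prefab, content for a new conference year can be added without touching the scene hierarchy, reflecting the separation of concerns noted above.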
The app was originally built with GameSparks for backend services and a licensed data-binding plugin for UI management; we replaced GameSparks with Unity's built-in serialization for offline data handling and transitioned to an open-source MVVM toolkit. These changes not only reduced external dependencies and licensing costs but also simplified the codebase, making it more approachable for new contributors.
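As an example of the offline data handling, a minimal save system built on Unity's \verb|JsonUtility| might look like the sketch below. The \verb|CollectionData| shape is an assumption for illustration only, not ScavengeAR's actual save format.
\begin{verbatim}
using System.IO;
using UnityEngine;

// Hypothetical save format for a player's creature collection.
[System.Serializable]
public class CollectionData
{
    public string role;                // Artist, Scientist, or Educator
    public string[] capturedCreatures; // creatures already photographed
}

public static class SaveSystem
{
    static string SavePath =>
        Path.Combine(Application.persistentDataPath, "collection.json");

    public static void Save(CollectionData data) =>
        File.WriteAllText(SavePath, JsonUtility.ToJson(data, true));

    public static CollectionData Load() =>
        File.Exists(SavePath)
            ? JsonUtility.FromJson<CollectionData>(File.ReadAllText(SavePath))
            : new CollectionData();
}
\end{verbatim}
Because everything is read and written locally, the app no longer needs a network connection or a hosted backend during the conference.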
While the core functionality remains consistent with previous iterations, this overhaul ensures that ScavengeAR can continue to be a sustainable and educational project. By leveraging Unity's modern XR capabilities and open-source tools, we've positioned the app for easier updates and potential future enhancements, aligning with our goal of providing valuable learning experiences for students and enjoyable interactions for conference attendees.
\section{Art and Design}
\begin{figure}[ht]
\centering
\includegraphics[width=\linewidth]{newsigglets.jpg}
\caption{New creatures for future ScavengeAR}
\label{fig:newcreatures}
\end{figure}
Indiana University Bloomington has historically created new 3D creatures for each iteration of ScavengeAR, and they have returned to do so in 2025. Our new 3D creatures were created using a traditional animation and visual effects pipeline. In the future, we hope to ethically explore generative AI to create creatures.
On the design side, we focused on improving the camera-capture experience: we added a shutter effect and a polaroid-style fade-in animation, and modernized the camera GUI to match contemporary phone interactions while enhancing our analog camera motif.
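As one example, a polaroid-style fade-in can be driven by a short coroutine easing a \verb|CanvasGroup|'s alpha, as in the hypothetical sketch below; class names and timing are illustrative, not the app's actual code.
\begin{verbatim}
using System.Collections;
using UnityEngine;

// Hypothetical sketch of the polaroid "developing" effect: the
// captured photo's CanvasGroup alpha is eased from 0 to 1.
public class PolaroidFade : MonoBehaviour
{
    [SerializeField] CanvasGroup photoGroup; // the developing print
    [SerializeField] float duration = 1.5f;

    public void Play() => StartCoroutine(FadeIn());

    IEnumerator FadeIn()
    {
        float t = 0f;
        while (t < duration)
        {
            t += Time.deltaTime;
            photoGroup.alpha = Mathf.SmoothStep(0f, 1f, t / duration);
            yield return null;
        }
        photoGroup.alpha = 1f; // snap to fully developed
    }
}
\end{verbatim}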
\section{Future Work}
\begin{figure}[ht]
\centering
\includegraphics[width=\linewidth]{siggraph-scavengear-doggo-p-1600.png}
\caption{A 2018 ScavengeAR creature, originally featured at the SIGGRAPH Studio venue in the Vancouver Convention Centre, shown here without the mobile user interface}
\label{fig:doggo}
\end{figure}
Mobile AR has historically offered a more accurate representation of augmented reality by leveraging the raw camera feed to detect physical markers in the environment. At the time, mixed reality headsets restricted access to raw passthrough video, limiting similar capabilities. That trend is now shifting, with newer headsets offering improved access to video passthrough.
As we look ahead, a key challenge will be unifying the user experience (UX), user interface (UI), and interaction patterns across both mobile and headset-based platforms. With our latest refactor, we are well-positioned to support both moving forward.
\begin{acks}
ScavengeAR could not have been created without the significant contributions of Casey Kwock, Victor Leung, Thach Nguyen, Louie Whitesel, and Zeb Wood.
Additionally, we appreciate the supporting contributions of Jose Garrido, Spencer Hayes, Alli Johnson, and Andy Wang.
\end{acks}
\end{document}