%%
%% This is file `sample-sigconf.tex',
%% generated with the docstrip utility.
%%
%% The original source files were:
%%
%% samples.dtx (with options: `all,proceedings,bibtex,sigconf')
%%
%% IMPORTANT NOTICE:
%%
%% For the copyright see the source file.
%%
%% Any modified versions of this file must be renamed
%% with new filenames distinct from sample-sigconf.tex.
%%
%% For distribution of the original source see the terms
%% for copying and modification in the file samples.dtx.
%%
%% This generated file may be distributed as long as the
%% original source files, as listed above, are part of the
%% same distribution. (The sources need not necessarily be
%% in the same archive or directory.)
%%
%%
%% Commands for TeXCount
%TC:macro \cite [option:text,text]
%TC:macro \citep [option:text,text]
%TC:macro \citet [option:text,text]
%TC:envir table 0 1
%TC:envir table* 0 1
%TC:envir tabular [ignore] word
%TC:envir displaymath 0 word
%TC:envir math 0 word
%TC:envir comment 0 0
%%
%% The first command in your LaTeX source must be the \documentclass
%% command.
%%
%% For submission and review of your manuscript please change the
%% command to \documentclass[manuscript, screen, review]{acmart}.
%%
%% When submitting camera ready or to TAPS, please change the command
%% to \documentclass[sigconf]{acmart} or whichever template is required
%% for your publication.
%%
%%
\documentclass[sigconf]{acmart}
%%
%% \BibTeX command to typeset BibTeX logo in the docs
\AtBeginDocument{%
\providecommand\BibTeX{{%
Bib\TeX}}}
%% Rights management information. This information is sent to you
%% when you complete the rights form. These commands have SAMPLE
%% values in them; it is your responsibility as an author to replace
%% the commands and values with those provided to you when you
%% complete the rights form.
\setcopyright{acmlicensed}
\copyrightyear{2018}
\acmYear{2018}
\acmDOI{XXXXXXX.XXXXXXX}
%% These commands are for a PROCEEDINGS abstract or paper.
\acmConference[Conference acronym 'XX]{Make sure to enter the correct
conference title from your rights confirmation email}{June 03--05,
2018}{Woodstock, NY}
%%
%% Uncomment \acmBooktitle if the title of the proceedings is different
%% from ``Proceedings of ...''!
%%
%%\acmBooktitle{Woodstock '18: ACM Symposium on Neural Gaze Detection,
%% June 03--05, 2018, Woodstock, NY}
\acmISBN{978-1-4503-XXXX-X/2018/06}
%%
%% Submission ID.
%% Use this when submitting an article to a sponsored event. You'll
%% receive a unique submission ID from the organizers
%% of the event, and this ID should be used as the parameter to this command.
%%\acmSubmissionID{123-A56-BU3}
%%
%% For managing citations, it is recommended to use bibliography
%% files in BibTeX format.
%%
%% You can then either use BibTeX with the ACM-Reference-Format style,
%% or BibLaTeX with the acmnumeric or acmauthoryear styles, which include
%% support for advanced citation of software artefact from the
%% biblatex-software package, also separately available on CTAN.
%%
%% Look at the sample-*-biblatex.tex files for templates showcasing
%% the biblatex styles.
%%
%%
%% The majority of ACM publications use numbered citations and
%% references. The command \citestyle{authoryear} switches to the
%% "author year" style.
%%
%% If you are preparing content for an event
%% sponsored by ACM SIGGRAPH, you must use the "author year" style of
%% citations and references.
%% Uncommenting
%% the next command will enable that style.
%%\citestyle{acmauthoryear}
%%
%% end of the preamble, start of the body of the document source.
\begin{document}
%%
%% The "title" command has an optional parameter,
%% allowing the author to define a "short title" to be used in page headers.
\title{ScavengeAR: From Licensing Fees to Free — Rebuilding Mobile AR with Unity-Native Tools}
%%
%% The "author" command and its associated commands are used to define
%% the authors and their affiliations.
%% Of note is the shared affiliation of the first two authors, and the
%% "authornote" and "authornotemark" commands
%% used to denote shared contribution to the research.
\author{Victor Leung}
%%\email{thevictor2225@gmail.com}
\orcid{0009-0000-0600-668X}
\affiliation{%
  \institution{Independent Developer}
  \city{Mountain View}
  \state{California}
  \country{USA}
}
%%
%% The abstract is a short summary of the work to be presented in the
%% article.
\begin{abstract}
ScavengeAR, a conference-scale AR creature-collecting game first launched at SIGGRAPH in 2017, returns in 2025 with a fully modernized tech stack and refined core gameplay, preserving the original player experience while eliminating costly third-party dependencies.
\end{abstract}
%% A "teaser" image appears between the author and affiliation
%% information and the body of the document, and typically spans the
%% page.
\begin{teaserfigure}
\includegraphics[width=\textwidth]{scavengearbanner}
\caption{ScavengeAR banner artwork for the 2025 relaunch.}
\Description{Promotional banner artwork for the ScavengeAR mobile app.}
\label{fig:teaser}
\end{teaserfigure}
%%
%% This command processes the author and affiliation and title
%% information and builds the first part of the formatted document.
\maketitle
\section{Introduction}
ScavengeAR was SIGGRAPH's official AR mobile app from 2017 to 2019, offering attendees an engaging Augmented Reality experience and providing Indiana University students with hands-on opportunities in interactive media development.
In the game, SIGGRAPH attendees choose a role — Artist, Scientist, or Educator — and explore the conference venue to discover printed image targets that, when scanned through the app, spawn 3D AR creatures into their physical surroundings. Like a digital safari, players use the in-app camera to photograph each creature, capturing them into a personal collection. New creatures are released each day of the conference, encouraging daily exploration and unlocking new layers of narrative and prize opportunities over time.
\section{Motivation}
ScavengeAR ran from 2017 to 2019, drawing over a thousand daily active users during each conference. In 2020, the COVID-19 pandemic halted in-person activity for several years and effectively ended the app's development, leaving it in a deprecated state.
In 2025, a small volunteer team undertook a comprehensive refactoring of ScavengeAR to improve its maintainability and accessibility, reviving the spirit of the app for a modern audience.
\section{Technical Approach}
The original version of ScavengeAR was built using Unity 2018 and Vuforia 8, a robust and accessible image tracking solution that was among the most practical options for marker-based AR at the time. Vuforia's artist-friendly pipeline utilized a parent-child GameObject hierarchy within a single-scene architecture, allowing low-code AR development and rapid prototyping in the era before ARKit/ARCore standardization. However, Vuforia required licensing fees for production deployment and cloud-hosted image target storage, and its roadmap has since shifted towards 3D object and model tracking.
For the 2025 relaunch, ScavengeAR was reengineered using Unity 2023.2 and AR Foundation 5.1 — Unity’s native, cross-platform AR framework built on top of ARKit and ARCore — to support lightweight, offline 2D image tracking and reduce long-term costs, build size, and technical dependencies.
Migrating to AR Foundation required an architectural shift from Vuforia's monolithic, parent-child single-scene model to AR Foundation's modular, multi-scene, prefab-driven design. While the new system gives up some of Vuforia's ease of use, it offers better performance management and a clearer separation of concerns. AR Foundation also supports full in-editor XR simulation (replacing webcam-based editor workflows) and more performant URP integration in place of the Built-in Render Pipeline. This was crucial for iterating quickly on the modernized photo capture experience.
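To illustrate the prefab-driven AR Foundation workflow described above, a minimal tracked-image handler might look like the following sketch, which subscribes to the \verb|trackedImagesChanged| event of an \verb|ARTrackedImageManager| and spawns a creature prefab on each newly detected image target. The component and field names here are hypothetical and are not taken from the ScavengeAR codebase:
\begin{verbatim}
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: spawns a creature prefab when a printed
// image target is detected. Names are illustrative only.
[RequireComponent(typeof(ARTrackedImageManager))]
public class CreatureSpawner : MonoBehaviour
{
    [SerializeField]
    private GameObject creaturePrefab; // assigned in the Inspector

    private ARTrackedImageManager imageManager;

    private void Awake()
    {
        imageManager = GetComponent<ARTrackedImageManager>();
    }

    private void OnEnable()
    {
        imageManager.trackedImagesChanged += OnTrackedImagesChanged;
    }

    private void OnDisable()
    {
        imageManager.trackedImagesChanged -= OnTrackedImagesChanged;
    }

    private void OnTrackedImagesChanged(
        ARTrackedImagesChangedEventArgs args)
    {
        // Parent each creature to its tracked image so it stays
        // anchored to the printed marker as the device moves.
        foreach (ARTrackedImage trackedImage in args.added)
        {
            Instantiate(creaturePrefab, trackedImage.transform);
        }
    }
}
\end{verbatim}
Parenting the spawned creature to the tracked image's transform keeps it registered to the physical marker without any per-frame bookkeeping in game code.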
ScavengeAR originally relied on GameSparks for backend services and a licensed data-binding plugin for UI management; we replaced GameSparks with Unity's built-in serialization for offline data handling and transitioned to an open-source MVVM toolkit. These changes not only reduced external dependencies and licensing costs but also simplified the codebase, making it more approachable for new contributors.
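As a sketch of the offline data handling described above, Unity's built-in \verb|JsonUtility| can serialize a plain save-data class to the device's persistent data path. The type and field names below are hypothetical, chosen only to illustrate the pattern:
\begin{verbatim}
using System.IO;
using UnityEngine;

// Hypothetical sketch of offline save data using Unity's
// built-in JSON serialization; names are illustrative.
[System.Serializable]
public class PlayerCollection
{
    public string playerRole;
    public string[] capturedCreatureIds;
}

public static class SaveSystem
{
    private static string SavePath =>
        Path.Combine(Application.persistentDataPath,
                     "collection.json");

    public static void Save(PlayerCollection collection)
    {
        // JsonUtility.ToJson produces a compact JSON string
        // from any [Serializable] class.
        File.WriteAllText(SavePath,
                          JsonUtility.ToJson(collection));
    }

    public static PlayerCollection Load()
    {
        if (!File.Exists(SavePath))
            return new PlayerCollection();
        return JsonUtility.FromJson<PlayerCollection>(
            File.ReadAllText(SavePath));
    }
}
\end{verbatim}
Because everything is stored locally, the app works fully offline, with no per-user cloud storage or backend service fees.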
While the core functionality remains consistent with previous iterations, this overhaul ensures that ScavengeAR can continue to be a sustainable and educational project. By leveraging Unity's modern XR capabilities and open-source tools, we've positioned the app for easier updates and potential future enhancements, aligning with our goal of providing valuable learning experiences for students and enjoyable interactions for conference attendees.
\section{Art and Design}
Indiana University Bloomington has historically created new 3D creatures for each iteration of ScavengeAR, and they have returned to do so in 2025. We proudly note that our 3D creatures were created using a traditional animation and visual effects pipeline, though we did use OpenAI's ChatGPT for occasional critiques and image-to-image refinements.
On the design side, we focused on improving the camera capture experience: we added a shutter animation and a Polaroid-style fade-in, and modernized the camera GUI to match contemporary phone interactions while enhancing our analog camera motif.
\section{Future Work}
We hope to revisit the generative AI pipeline in future builds, which will open new avenues for content creation.
At the moment we are not using cloud services, but we hope to add them back in at a later date.
Although we revived the core gameplay loop, we removed Photo Mode and the interactive AR tutorial.
\begin{acks}
To Robert, for the bagels and explaining CMYK and color spaces.
\end{acks}
%%
%% The next two lines define the bibliography style to be used, and
%% the bibliography file.
\bibliographystyle{ACM-Reference-Format}
\bibliography{sample-base}
%%
%% If your work has an appendix, this is the place to put it.
\appendix
\end{document}
\endinput
%%
%% End of file `sample-sigconf.tex'.