Unity for AR/VR Mobile App Development

Why Unity remains the practical choice for immersive mobile experiences in 2025.

[Image: A developer’s desk with a laptop running the Unity Editor, a smartphone mounted in a simple VR headset, and a coffee mug, illustrating the mobile AR/VR development setup.]

I started my first Unity mobile AR project five years ago, using a borrowed iPhone 8 and a slightly-too-warm laptop. The concept was simple: overlay basic furniture in a living room to help with a remodel. What wasn’t simple was figuring out which AR framework to use, how to keep the frame rate stable on a mobile device, and how to debug an issue that only appeared when the phone was held upside down in low light. That project shipped, barely, but it taught me a critical lesson: for mobile AR and VR, the engine you choose matters less than the workflow you build around it. Unity is not perfect, but it is currently the most balanced tool for bridging the gap between creative design and mobile hardware constraints.

In this post, I’ll walk you through the real-world landscape of building AR and VR apps for mobile using Unity. We won’t just list features; we’ll look at how decisions made in the Editor affect performance on a phone, how to structure a project for maintainability, and where Unity fits compared to alternatives like Unreal or native SDKs. We’ll examine common pitfalls, such as ignoring the render scale on VR devices or mishandling AR session states, and I’ll share a few hard-won insights from my own time in the trenches. If you’re a developer or a technically curious reader wondering whether Unity is the right foundation for your next immersive mobile project, this should give you a grounded perspective.

Context: Where Unity Fits in the Mobile AR/VR Ecosystem Today

Unity is a cross-platform game engine, but over the last decade, it has evolved into a general-purpose tool for interactive 3D content, including AR and VR. On mobile, it sits at the center of a fragmented ecosystem. Apple’s ARKit and Google’s ARCore provide the foundational tracking and environmental understanding, but they are native SDKs. Unity acts as the bridge, offering a unified API via its XR Plugin system and the AR Foundation package. This means you can write a single codebase that targets both iOS and Android for many AR features, though platform-specific quirks always remain.

Who uses Unity for mobile AR/VR? Indie developers, small studios, and even large enterprises building training simulators or retail visualization tools. In my experience, teams choose Unity because it lowers the barrier to entry for 3D interaction. A designer can prototype an AR scene without touching native code, and a programmer can later optimize it for performance. Compared to native development (Swift for iOS ARKit, Kotlin for Android ARCore), Unity saves time upfront but requires careful optimization to match native performance. Compared to Unreal Engine, Unity generally has a lighter footprint on mobile, though Unreal’s high-fidelity graphics can be tempting for VR experiences on powerful headsets. For mobile AR, where battery life and thermal throttling are real concerns, Unity’s flexibility often wins out.

Real-world usage often looks like this: A retail app uses AR to let users place products in their homes. The core logic is in C# scripts within Unity, handling AR plane detection, object placement, and simple UI. The heavy lifting—rendering, physics—is delegated to the engine, but developers must still profile the app on actual devices, not just the Editor. I’ve seen projects where everything ran smoothly in the Unity Editor but crashed on a mid-range Android phone due to unoptimized texture sizes. That’s the kind of gap Unity helps you close, but only if you respect the platform constraints.

Technical Core: Concepts, Capabilities, and Practical Examples

AR Development with Unity and AR Foundation

AR Foundation is Unity’s abstraction layer over ARKit and ARCore. It handles plane detection, image tracking, and environment probing, among other features. The key mental model here is that AR Foundation provides components you add to a GameObject, and you write scripts to respond to events like “a plane was detected” or “an image was tracked.”
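
For instance, plane detection surfaces as events on ARPlaneManager. Here’s a minimal sketch of that event-driven model (the PlaneLogger class name and the logging are just for illustration):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneLogger : MonoBehaviour
{
    [SerializeField] private ARPlaneManager planeManager;

    void OnEnable() => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    private void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args also exposes updated and removed planes
        foreach (var plane in args.added)
            Debug.Log($"Plane detected: {plane.trackableId}");
    }
}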

A typical AR scene in Unity includes:

  • An AR Session (manages the lifecycle of the AR experience).
  • An AR Session Origin (the root for the camera and tracked objects; renamed XR Origin in AR Foundation 5).
  • AR components like AR Plane Manager and AR Raycast Manager.

Here’s a simple example: placing a 3D object on a detected plane when the user taps the screen. This is a common pattern in retail or educational apps.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARRaycastManager))]
public class ARObjectPlacer : MonoBehaviour
{
    [SerializeField] private GameObject prefabToPlace;
    private ARRaycastManager raycastManager;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            Vector2 touchPosition = Input.GetTouch(0).position;

            // Raycast against detected planes only; hits come back sorted by distance
            if (raycastManager.Raycast(touchPosition, hits, TrackableType.PlaneWithinPolygon))
            {
                Pose hitPose = hits[0].pose;
                Instantiate(prefabToPlace, hitPose.position, hitPose.rotation);
            }
        }
    }
}

In this script, we use ARRaycastManager to detect a plane at the touch position and instantiate a prefab. In a real project, you’d add validation: check if the plane is large enough, disable placement on vertical planes for certain objects, and maybe add haptic feedback. I’ve used this pattern in an app for visualizing HVAC units in commercial spaces. The trick was tuning the raycast to ignore small, noisy planes that ARCore sometimes generates near edges.
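
One way to implement that validation is to filter raycast hits by the size of the plane they landed on. A minimal sketch, reusing the hits list from the script above; the 0.2 m default threshold is an illustrative value, not a recommendation:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public static class PlacementValidation
{
    // Returns the first hit whose plane is at least minSize meters on each side.
    public static bool TryGetValidHit(List<ARRaycastHit> hits, out Pose pose, float minSize = 0.2f)
    {
        foreach (var hit in hits)
        {
            if (hit.trackable is ARPlane plane &&
                plane.size.x >= minSize && plane.size.y >= minSize)
            {
                pose = hit.pose;
                return true;
            }
        }
        pose = default;
        return false;
    }
}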

A common issue: AR Foundation can be sensitive to device compatibility. Not all Android phones support ARCore, and iOS devices have varying AR capabilities. Check availability with ARSession.CheckAvailability() (which updates ARSession.state) before starting the session, and provide a fallback, like a simple 3D viewer without AR, as in the sketch below.
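
A minimal availability gate, assuming arSession starts disabled and fallbackViewer is a hypothetical non-AR scene root you provide:

using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARAvailabilityGate : MonoBehaviour
{
    [SerializeField] private ARSession arSession;        // disabled by default
    [SerializeField] private GameObject fallbackViewer;  // hypothetical plain 3D viewer

    IEnumerator Start()
    {
        // Asks the platform whether AR is supported; updates ARSession.state
        yield return ARSession.CheckAvailability();

        if (ARSession.state == ARSessionState.Unsupported)
            fallbackViewer.SetActive(true);
        else
            arSession.enabled = true;  // the session handles NeedsInstall itself
    }
}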

VR Development with Unity and OpenXR

For VR, Unity supports OpenXR, which has become the standard across headsets like Meta Quest and HTC Vive Focus. On mobile, the primary target is standalone Android-based headsets (e.g., Meta Quest 2/3); phone-in-headset VR via Google’s Cardboard SDK or the old Oculus mobile tooling is now largely legacy. Unity’s XR Interaction Toolkit provides components for handling controllers, hand tracking, and locomotion.

A basic VR setup involves:

  • An XR Origin, formerly called the XR Rig (the player’s camera and controllers).
  • Interaction components like XR Ray Interactor and XR Direct Interactor for selecting and grabbing objects.
  • Locomotion systems (teleportation or continuous movement).

Consider a simple VR interaction where the user grabs an object with a controller. This is foundational for games or training simulations.

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class VRGrabExample : MonoBehaviour
{
    [SerializeField] private XRGrabInteractable grabInteractable;

    void OnEnable()
    {
        // Subscribe to the interactable's grab/release events
        grabInteractable.selectEntered.AddListener(OnGrab);
        grabInteractable.selectExited.AddListener(OnRelease);
    }

    void OnDisable()
    {
        // Unsubscribe to avoid dangling listeners
        grabInteractable.selectEntered.RemoveListener(OnGrab);
        grabInteractable.selectExited.RemoveListener(OnRelease);
    }

    private void OnGrab(SelectEnterEventArgs args)
    {
        Debug.Log("Object grabbed by " + args.interactorObject.transform.name);
        // Add haptic feedback or sound here
    }

    private void OnRelease(SelectExitEventArgs args)
    {
        Debug.Log("Object released");
        // Optional: Apply physics forces on release
    }
}

In this example, XRGrabInteractable handles the grab logic. You attach this script to a GameObject with a collider and a rigidbody. In practice, for mobile VR, you must optimize for performance: keep draw calls low, use simple shaders, and avoid real-time shadows. I once worked on a VR training module for field technicians. The initial build dropped frames on a Quest 2 because of high-poly models. Reducing mesh complexity and baking lighting solved it, but it required constant profiling with Unity’s Profiler and Frame Debugger.
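
The OnGrab stub above mentions haptic feedback. With XRI 2.x controller interactors, a pulse can be sent like this; the amplitude and duration values are illustrative:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class GrabHaptics : MonoBehaviour
{
    [SerializeField] private XRGrabInteractable grabInteractable;

    void OnEnable() => grabInteractable.selectEntered.AddListener(OnGrab);
    void OnDisable() => grabInteractable.selectEntered.RemoveListener(OnGrab);

    private void OnGrab(SelectEnterEventArgs args)
    {
        // Pulse the controller that grabbed the object: (amplitude 0-1, duration in seconds)
        if (args.interactorObject is XRBaseControllerInteractor interactor)
            interactor.xrController.SendHapticImpulse(0.5f, 0.1f);
    }
}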

For mobile VR, OpenXR reduces fragmentation, but you still need to account for controller differences (e.g., Oculus Touch vs. Vive wands). Unity’s Input System package helps abstract this, but testing on actual hardware is non-negotiable. Emulators can’t replicate motion sickness or tracking latency.
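
To see what that abstraction buys you, here’s a minimal Input System sketch that reads the right-hand trigger through one vendor-neutral binding path (it assumes the XR controller layouts that ship with the Input System package):

using UnityEngine;
using UnityEngine.InputSystem;

public class TriggerProbe : MonoBehaviour
{
    private InputAction triggerAction;

    void OnEnable()
    {
        // One binding path covers Touch controllers, Vive wands, and other XR controllers
        triggerAction = new InputAction(type: InputActionType.Value,
            binding: "<XRController>{RightHand}/trigger");
        triggerAction.Enable();
    }

    void Update()
    {
        if (triggerAction.ReadValue<float>() > 0.9f)
            Debug.Log("Trigger fully pressed");
    }

    void OnDisable() => triggerAction.Disable();
}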

Workflow and Project Structure

A clean project structure prevents chaos as the app grows. Here’s a typical layout for a mobile AR/VR project:

Assets/
├── _Scenes/
│   ├── AR_Main.unity
│   ├── VR_Main.unity
│   └── Menu.unity
├── _Scripts/
│   ├── AR/
│   │   ├── ARSessionManager.cs
│   │   └── ObjectPlacer.cs
│   ├── VR/
│   │   ├── VRInteraction.cs
│   │   └── LocomotionController.cs
│   └── Common/
│       ├── UIManager.cs
│       └── AudioManager.cs
├── _Prefabs/
│   ├── AR/
│   │   └── PlacementPrefab.prefab
│   └── VR/
│       └── GrabObject.prefab
├── _Materials/
│   ├── MobileOpaque.mat
│   └── UIMaterial.mat
├── _Textures/
│   ├── iOS/
│   └── Android/
└── _Plugins/
    └── Android/
        └── AndroidManifest.xml

This structure separates concerns: scenes for different modes, scripts by feature, and assets by platform. Use Unity’s Addressable Asset System for dynamic loading to keep the initial build size small. For configuration, I often use ScriptableObjects to store settings like AR session parameters or VR movement speed. This keeps data out of code and allows designers to tweak values without recompiling.
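
On the Addressables point, loading a prefab by address might look like this sketch; "PlacementPrefab" is a hypothetical address you’d assign in the Addressables Groups window:

using UnityEngine;
using UnityEngine.AddressableAssets;
using UnityEngine.ResourceManagement.AsyncOperations;

public class PlacementPrefabLoader : MonoBehaviour
{
    [SerializeField] private string prefabAddress = "PlacementPrefab";
    private AsyncOperationHandle<GameObject> handle;

    void Start()
    {
        handle = Addressables.LoadAssetAsync<GameObject>(prefabAddress);
        handle.Completed += op =>
        {
            if (op.Status == AsyncOperationStatus.Succeeded)
                Instantiate(op.Result);
        };
    }

    void OnDestroy()
    {
        // Release the handle so the asset can be unloaded from memory
        if (handle.IsValid())
            Addressables.Release(handle);
    }
}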

A key workflow tip: always build and test on target devices early. Unity’s Build Settings let you switch platforms, but the process isn’t instantaneous. I set up a CI/CD pipeline using Unity Cloud Build to automate iOS and Android builds, which saves hours of manual work.

Here’s a simple ScriptableObject for AR configuration:

using UnityEngine;

[CreateAssetMenu(fileName = "ARConfig", menuName = "AR/Configuration")]
public class ARConfig : ScriptableObject
{
    [Header("Plane Detection")]
    public bool detectHorizontalPlanes = true;
    public bool detectVerticalPlanes = false;

    [Header("Session Settings")]
    public float planeDetectionInterval = 1.0f; // Seconds between plane updates
}

You reference this in your AR session manager:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ARSessionManager : MonoBehaviour
{
    [SerializeField] private ARConfig config;
    private ARPlaneManager planeManager;

    void Start()
    {
        planeManager = GetComponent<ARPlaneManager>();

        // Combine the configured detection axes into a single flags value
        var mode = PlaneDetectionMode.None;
        if (config.detectHorizontalPlanes) mode |= PlaneDetectionMode.Horizontal;
        if (config.detectVerticalPlanes) mode |= PlaneDetectionMode.Vertical;
        planeManager.requestedDetectionMode = mode;
        // Additional setup based on config...
    }
}

This approach made it easy to switch between “furniture placement” and “wall art preview” modes in a retail app I consulted on. Designers could create new config assets without touching code.

Honest Evaluation: Strengths, Weaknesses, and Tradeoffs

Unity shines in rapid prototyping and cross-platform deployment. For AR/VR on mobile, its asset store is a goldmine—plugins for AR Foundation, 3D models, and tools like Photon for multiplayer can jumpstart development. The editor’s visual workflow allows non-programmers to contribute, which is invaluable in small teams. Performance-wise, Unity is decent for mobile if you follow best practices: use the Universal Render Pipeline (URP) for lightweight rendering, avoid unnecessary physics, and profile with the Unity Profiler.

However, Unity isn’t a silver bullet. The learning curve for advanced features like custom shaders or Burst-compiled jobs can be steep. On older mobile devices, Unity apps can feel bloated if not optimized; I’ve seen AR apps crash due to memory leaks from unmanaged textures. Compared to native ARKit, Unity’s AR Foundation may lag behind the latest iOS features (e.g., LiDAR scanning on newer iPhones). For VR, Unity is strong on standalone headsets, but for lightweight experiences that don’t justify an app install, WebXR might be a lighter alternative.

Tradeoffs include:

  • Time vs. Performance: Unity speeds up development but requires more optimization effort for smooth mobile performance.
  • Flexibility vs. Complexity: You can build almost anything, but overusing assets or plugins can lead to compatibility issues.
  • Cost: Unity is free for personal use, but commercial projects need a paid subscription once revenue exceeds the Unity Personal threshold (raised to $200k/year in Unity’s 2023 licensing changes), and asset store purchases add up.

In scenarios where you need pixel-perfect AR with minimal overhead (e.g., a simple marker-based AR for events), native SDKs might be faster. For VR games targeting high-end headsets, Unreal’s graphics could be better, but for mobile AR/VR, Unity’s balance keeps it relevant.

Personal Experience: Lessons from the Trenches

I’ve built three production AR/VR apps with Unity for mobile, and each taught me something new. My first AR project taught me the importance of light estimation; without it, virtual objects looked out of place in dim rooms, and users complained. Unity’s AR Foundation has light estimation built in, but I initially disabled it to save performance—big mistake. Now, I always enable it for AR scenes.
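
Wiring light estimation up is straightforward. A minimal sketch, assuming light estimation is enabled on the ARCameraManager and that your directional light has Use Color Temperature enabled:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class EstimatedLight : MonoBehaviour
{
    [SerializeField] private ARCameraManager cameraManager;
    [SerializeField] private Light mainLight; // your scene's directional light

    void OnEnable() => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    private void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Estimates are nullable; the platform may not supply every value each frame
        if (args.lightEstimation.averageBrightness.HasValue)
            mainLight.intensity = args.lightEstimation.averageBrightness.Value;

        if (args.lightEstimation.averageColorTemperature.HasValue)
            mainLight.colorTemperature = args.lightEstimation.averageColorTemperature.Value;
    }
}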

Common mistakes I’ve made or seen:

  • Ignoring platform differences: Android’s camera permission handling is stricter than iOS’s. Use Unity’s runtime permission API (UnityEngine.Android.Permission) to request camera access in AR apps; see the sketch after this list.
  • Over-relying on the Editor: Physics behave differently on mobile. Test grabbing in VR on a real headset to catch jitter.
  • Version mismatches: Upgrading Unity or AR Foundation without updating XR Plug-in Management broke a build once. Pin your package versions in Packages/manifest.json.
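
A minimal sketch of that Android permission check (AR Foundation typically requests camera access itself, so treat this as an explicit safety net):

using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

public class CameraPermissionGate : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID
        // Request camera access at runtime before enabling the AR session
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
            Permission.RequestUserPermission(Permission.Camera);
#endif
    }
}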

A moment when Unity proved valuable was during a VR onboarding app for a medical device. We needed to simulate hand interactions without controllers. Unity’s XR Interaction Toolkit with hand tracking integration saved us weeks of native coding. The result was a stable app that ran on Quest 2 and Pico headsets with minimal changes.

The learning curve isn’t linear. Start with simple scenes, then layer in complexity. I recommend pairing Unity with version control (Git) and using Unity Version Control (formerly Plastic SCM, which absorbed the now-retired Unity Collaborate) for team projects, rather than the chaos of manual file sharing.

Getting Started: Setup and Mental Models

To begin, download Unity Hub and install an LTS version (e.g., 2022.3 LTS for stability). Create a new 3D project (URP template for mobile efficiency). Then, via the Package Manager, install:

  • AR Foundation (for AR).
  • XR Interaction Toolkit (for VR).
  • XR Plug-in Management (to enable ARCore/ARKit or OpenXR).

For AR on Android:

  1. Enable ARCore in Player Settings (under XR Plug-in Management).
  2. Set minimum API level to 24 (Android 7.0) and target API to latest.
  3. Add camera permission in the Android manifest (via Assets/Plugins/Android/AndroidManifest.xml).

For VR on Quest (Android-based):

  1. Install the Meta XR SDK (successor to the now-deprecated Oculus Integration asset), or use OpenXR.
  2. Set Graphics API to Vulkan for better performance.
  3. Configure Build Settings for Android, with Oculus as the target.

Folder structure as outlined earlier—keep it clean to avoid refactoring later. The mental model: Unity is a layer over native platforms. Your scripts are C#, but they call into native plugins for AR/VR features. Profile early: use Unity’s Profiler for CPU/GPU, and Xcode/Android Studio for deep dives on devices.
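
For the profiling habit, custom markers make your own hot paths visible in the Profiler window. A small sketch using Unity.Profiling; the marker name is arbitrary:

using Unity.Profiling;
using UnityEngine;

public class PlacementProfiling : MonoBehaviour
{
    // Shows up as "AR.Placement" in the Profiler's CPU timeline
    static readonly ProfilerMarker s_PlacementMarker = new ProfilerMarker("AR.Placement");

    void Update()
    {
        using (s_PlacementMarker.Auto())
        {
            // ...raycast and placement work to be measured...
        }
    }
}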

What makes Unity stand out is its ecosystem: the Asset Store for quick assets, Unity Cloud for collaboration, and a vast community for troubleshooting. For maintainability, use Unity’s Package Manager to manage dependencies and avoid manual DLLs. Developer experience is smooth for iteration—hot reload in the Editor (though limited), but always compile for device testing.

Free Learning Resources

Here are curated, practical resources to get started:

  • Unity Learn: AR Foundation Tutorials (learn.unity.com): Free interactive courses on AR setup and best practices. Why useful? Hands-on projects with downloadable assets.
  • Unity Manual: XR Interaction Toolkit (docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@2.0/manual): Official docs with examples. Why useful? It’s the source of truth for VR interactions, with code snippets you can adapt.
  • Google ARCore Documentation (developers.google.com/ar/develop): Covers AR Foundation integration for Android. Why useful? Explains device compatibility and optimization tips.
  • Apple ARKit Documentation (developer.apple.com/augmented-reality): For iOS-specific AR features. Why useful? Helps understand platform nuances in Unity’s AR Foundation.
  • YouTube: Valem’s Unity VR Tutorials (channel: Valem): Free videos on VR development with Unity. Why useful? Real-world project walkthroughs, including performance tuning.

These resources are grounded in official or community-verified content, avoiding fluff. Start with Unity Learn for a structured path.

Conclusion: Who Should Use Unity for Mobile AR/VR?

Unity is ideal for developers and small teams building cross-platform AR/VR mobile apps who value speed and flexibility over absolute native performance. It’s perfect for prototypes, retail visualizations, educational tools, and lightweight games. If you’re coming from a web or general programming background, its C# scripting feels approachable, and the visual editor lowers the 3D barrier.

Skip it if you’re targeting cutting-edge native AR features without compromise (go Swift/Kotlin) or building graphically intense VR for high-end PCs (consider Unreal). For mobile AR/VR, Unity’s tradeoffs—optimization effort for broad reach—make it a solid choice in 2025. The takeaway: Unity won’t make you an expert overnight, but with disciplined workflow and device testing, it can turn immersive ideas into shippable apps. If you’ve built with it, share your stories; the community thrives on those real-world lessons.

[Image: Screenshot of a Unity Editor scene with AR Foundation components, showing an AR Session Origin and a simple 3D cube placed on a detected plane in a test environment.]