ARCAI Research Lab

Lab Facilities

Explore the hardware, spaces, and infrastructure powering our research in autonomous systems, human-robot interaction, and next-generation AI.

19 Facilities & Platforms
600 m² Indoor Arena
8× A100 GPU Cluster

Featured Facilities

Main Robotics Arena: 600 m² indoor test environment (Operational)

Outdoor Field Site: 2-acre outdoor test field (Operational)

OptiTrack Motion Capture: 12-camera sub-millimeter tracking (Operational)

GPU Compute Cluster: 8× NVIDIA A100 80GB (Operational)

All Facilities (19)

Operational

Main Robotics Arena

600 m² indoor test environment

A fully equipped indoor arena designed for autonomous robot navigation, multi-agent coordination, and human-robot interaction experiments. Features configurable obstacle courses and real-time overhead tracking.

Area: 600 m²
Ceiling Height: 5.5 m
Floor Type: Anti-static composite
Lighting: Adjustable LED array

Operational

Outdoor Field Site

2-acre outdoor test field

Open-air facility for testing terrain-adaptive robots, aerial vehicles, and long-range autonomous systems under real environmental conditions. Includes GPS reference stations.

Area: 2 acres
Surface Types: Grass, gravel, asphalt
GPS Reference: RTK base station
Network: 5G + LoRa mesh

Operational (×2)

Clearpath Husky A200

All-terrain UGV platform

Rugged unmanned ground vehicle for outdoor autonomy research. Equipped with a 360° LiDAR, stereo camera, and onboard NVIDIA Jetson AGX Orin for edge inference.

Payload: 75 kg
Speed: 1.0 m/s
Endurance: 3 hrs
Count: 2 units

Operational (×6)

TurtleBot 4

Indoor mobile research platform

ROS2-native differential drive robots ideal for SLAM, path planning, and swarm robotics experiments. Pre-loaded with Nav2 stack and custom perception pipelines.

Diameter: 342 mm
Max Speed: 0.46 m/s
Sensors: RPLIDAR, OAK-D
Count: 6 units

New (×15)

Custom Swarm Robots

Micro-robot swarm platform

In-house developed 15-agent swarm testbed for decentralized coordination, emergent behavior, and bio-inspired algorithms. Each unit under 200g with IR communication.

Fleet Size: 15 agents
Weight: < 200 g
Comms: IR + WiFi 6
Endurance: 90 min
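
The decentralized coordination such a testbed targets can be sketched as a plain average-consensus update. This is an illustrative toy in Python, not the platform's actual firmware, and the communication graph and gain below are invented:

```python
def consensus_step(positions, neighbors, gain=0.2):
    """One synchronous average-consensus update.

    positions: list of (x, y) tuples, one per agent.
    neighbors: dict mapping agent index -> indices of agents it can hear
               (e.g. those within IR communication range).
    gain: blend factor in (0, 1); each agent moves this fraction of the
          way toward the mean of its neighbors' positions.
    """
    updated = []
    for i, (x, y) in enumerate(positions):
        nbrs = neighbors.get(i, [])
        n = max(len(nbrs), 1)
        # Average offset to neighbors; zero if the agent is isolated.
        dx = sum(positions[j][0] - x for j in nbrs) / n
        dy = sum(positions[j][1] - y for j in nbrs) / n
        updated.append((x + gain * dx, y + gain * dy))
    return updated

# Three agents on a line graph drift to a common rendezvous point.
pts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
graph = {0: [1], 1: [0, 2], 2: [1]}
for _ in range(200):
    pts = consensus_step(pts, graph)
```

On any connected graph all agents converge to a common point; for this symmetric line graph that point is the centroid, (1.0, 0.0).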

Operational (×2)

DJI Matrice 300 RTK

Enterprise UAV platform

Professional-grade quadrotor with centimeter-level positioning accuracy. Used for aerial mapping, infrastructure inspection, and UAV-UGV collaborative experiments.

Flight Time: 55 min
Payload: 2.7 kg
Positioning: RTK + D-RTK 2
Wind Resistance: 15 m/s

Operational (×8)

Custom Quadrotors

Research-grade micro UAVs

Lightweight agile platforms built on Betaflight firmware with onboard compute for vision-based autonomy and indoor swarm flight. Fully customizable hardware stack.

Size: 250 mm frame
Compute: Raspberry Pi CM4
Camera: Global shutter, 720p
Count: 8 units

Operational (×2)

Universal Robots UR10e

6-DOF collaborative robot arm

High-precision collaborative arms for manipulation research, assembly automation, and human-robot collaboration experiments. Integrated force-torque sensing.

Reach: 1300 mm
Payload: 12.5 kg
Repeatability: ±0.05 mm
Count: 2 units

Operational

OptiTrack Motion Capture

12-camera sub-millimeter tracking

State-of-the-art optical motion capture system covering the full arena. Provides ground-truth 6-DOF pose at 240 Hz, essential for algorithm benchmarking.

Cameras: 12 × Prime 41
Accuracy: < 0.3 mm
Frequency: 240 Hz
Coverage: Full arena
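
Ground truth from a system like this streams as a position plus unit quaternion per rigid body; for benchmarking against an estimator it is convenient to assemble each sample into a 4×4 homogeneous transform. A minimal sketch in pure Python, assuming the (w, x, y, z) quaternion convention and not tied to any particular mocap SDK:

```python
def pose_to_matrix(px, py, pz, qw, qx, qy, qz):
    """Build a 4x4 homogeneous transform (row-major nested lists) from a
    position and a *unit* quaternion in (w, x, y, z) order."""
    # Standard quaternion-to-rotation-matrix expansion.
    r = [
        [1 - 2 * (qy * qy + qz * qz), 2 * (qx * qy - qw * qz),     2 * (qx * qz + qw * qy)],
        [2 * (qx * qy + qw * qz),     1 - 2 * (qx * qx + qz * qz), 2 * (qy * qz - qw * qx)],
        [2 * (qx * qz - qw * qy),     2 * (qy * qz + qw * qx),     1 - 2 * (qx * qx + qy * qy)],
    ]
    # Append the translation column and the homogeneous bottom row.
    return [r[0] + [px], r[1] + [py], r[2] + [pz], [0.0, 0.0, 0.0, 1.0]]
```

With the identity quaternion (1, 0, 0, 0) this yields a pure translation, which makes it easy to sanity-check a streaming pipeline.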

Operational (×4)

Velodyne VLP-32C LiDAR

32-channel 3D LiDAR array

High-density point cloud sensors for 3D mapping, object detection, and SLAM research. Paired with calibrated camera rigs for sensor fusion studies.

Channels: 32
Range: 200 m
Accuracy: ±3 cm
Count: 4 units
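
Each return from a spinning LiDAR arrives in spherical form: a range, an azimuth, and the fixed elevation of its channel. Building a point cloud means converting those to Cartesian coordinates; the transform is standard, and the angles in the example below are illustrative rather than the VLP-32C's actual beam table:

```python
import math

def lidar_return_to_point(r, azimuth_deg, elevation_deg):
    """Convert one (range, azimuth, elevation) LiDAR return to an (x, y, z)
    point in the sensor frame. Azimuth is measured CCW from the x-axis."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horiz = r * math.cos(el)  # range projected onto the x-y plane
    return (horiz * math.cos(az), horiz * math.sin(az), r * math.sin(el))

# A 20 m return straight ahead from a channel aimed 10 degrees down.
x, y, z = lidar_return_to_point(20.0, 0.0, -10.0)
```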

Operational (×10)

Intel RealSense D435i

RGB-D stereo depth cameras

Compact depth cameras used across mobile and manipulation platforms. Ideal for close-range 3D reconstruction, object pose estimation, and dense SLAM.

Depth Range: 0.2 – 10 m
Resolution: 1280 × 720 @ 90 fps
IMU: 6-DOF integrated
Count: 10 units
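
For close-range reconstruction, each depth pixel is typically back-projected into a 3D camera-frame point with the pinhole model, which is roughly what RGB-D SDK deprojection helpers do once lens distortion is neglected. A sketch with placeholder intrinsics (fx, fy, cx, cy below are invented values, not a real calibration):

```python
def deproject_pixel(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth into a camera-frame
    (x, y, z) point using the distortion-free pinhole model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Placeholder intrinsics for a 1280x720 stream (illustrative only).
FX, FY, CX, CY = 900.0, 900.0, 640.0, 360.0
point = deproject_pixel(700.0, 400.0, 1.5, FX, FY, CX, CY)
```

A pixel at the principal point maps to (0, 0, depth), which is a quick self-test for any calibration pipeline.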
Operational

GPU Compute Cluster

8× NVIDIA A100 80GB

High-performance GPU cluster dedicated to deep learning training, reinforcement learning simulation, and large-scale data processing for robotics research.

GPUs: 8× NVIDIA A100 80GB
CPUs: 2× AMD EPYC 7742
RAM: 2 TB DDR4 ECC
Storage: 200 TB NVMe RAID

Operational (×12)

Edge Computing Nodes

NVIDIA Jetson fleet

Distributed edge inference nodes mounted on mobile platforms and fixed locations across the lab. Enables real-time perception pipelines without cloud dependency.

Platform: Jetson AGX Orin
AI Performance: 275 TOPS each
Network: 10 GbE + WiFi 6E
Count: 12 nodes

New (×2)

High-Speed Camera Array

Up to 10,000 fps capture

Photron FASTCAM high-speed cameras for analyzing fast robot dynamics, impact events, and contact mechanics at millisecond resolution.

Max Frame Rate: 10,000 fps
Resolution: 1024 × 1024
Count: 2 units
Trigger: Hardware sync

Operational

NVIDIA Isaac Sim

Photorealistic robot simulation

GPU-accelerated simulation environment for training and testing autonomous systems with sim-to-real transfer. Full ROS2 bridge, physics engine, and synthetic data generation.

Physics: PhysX 5
Rendering: RTX path tracing
ROS2: Humble / Iron
Licenses: 5 concurrent

Operational

Wireless Testbed

Multi-protocol RF lab

Dedicated RF environment with software-defined radios, 5G NR testbed, and LoRa mesh infrastructure. Enables research in robot communication protocols and networked autonomy.

SDR: USRP B210 × 6
Protocols: 5G NR, WiFi 6E, LoRa
Frequency: 70 MHz – 6 GHz
Shielding: RF-isolated room

New

Bio-Inspired Robotics Bench

Soft robotics & bio-mechatronics

Experimental workbench for soft robotics, compliant mechanisms, and bio-inspired locomotion research. Equipped with material testing rigs and 3D printing facilities.

3D Printers: Markforged X7, Bambu X1C
Actuators: SMA, pneumatic, DEA
Testing: Force/torque rig
Workspace: 2 benches

Operational

Human-Robot Interaction Lab

Instrumented HRI research space

Dedicated space for studying human-robot collaboration, trust, and ergonomics. Features biometric sensors, eye-tracking, EEG headsets, and configurable social robot setups.

Area: 80 m²
Eye Tracking: Tobii Pro Glasses 3
EEG: g.tec UNICORN
Social Robots: NAO v6, Pepper

Operational

MATLAB / Simulink

Control & algorithm prototyping

Full academic license suite covering Robotics System Toolbox, ROS Toolbox, Deep Learning, Reinforcement Learning, and Optimization toolboxes for rapid prototyping.

Toolboxes: 40+ licensed
Version: R2024b
Licenses: 20 concurrent
HPC Integration: Parallel Computing Toolbox

Interested in Lab Access?

We welcome collaborations with industry partners and visiting researchers. Reach out to discuss facility access, joint projects, or student opportunities.

Get in Touch