I recently earned my PhD in Computer Science from the University of Nevada, Reno. I enjoy building new things, spending time in XR, meeting new people, and learning new things. I spend my weekends hiking or at hackathons.
My current research interest is building a set of tools that help content developers assess and improve the accessibility of their content. My postdoctoral fellowship at SKERI will primarily involve developing A11yOS (read: Accessibility Operating System) under the mentorship of Dr. James Coughlan. I am looking to collaborate with color vision experts and occupational therapists. Email nasif.zaman@ski.org if you want to learn more and chat!
Individuals with comparable visual test results and diagnoses may have very different residual visual capacities and corresponding experiences. The considerable computational and rendering capabilities of modern extended reality (XR) devices make it possible to deliver personalized visual feedback tailored to each user's contextual needs. A11yOS, a near-eye head-mounted video pass-through display (HMVPD) framework, will consist of three modules that jointly help developers test the accessibility of their new assistive products against a diverse set of simulated perceptual losses, and help target users identify the best settings for their usage. My goal is to open the framework for public contribution once it has been validated with color-deficient and central-field-loss subjects.
Calibrating pass-through head-mounted systems to quantify and augment residual vision of impaired individuals: I am working on three modules whereby a content developer or consumer can increase the accessibility of their product by assessing the residual visual ability of the intended user and finding the optimal enhancements for each context.
#SwiftUI
#VisionOS
Detecting scotoma location and type through tracing: Low-vision ophthalmologists often use tracing tasks to diagnose scotoma location. We are collecting tracing data to train an ML model to predict scotoma features.
#UIKit
#PencilKit
Hackathon template for accessibility-focused projects: A11yFig is a template for hackathons that focuses on accessibility in wearables with cameras.
#Meta Quest 3
#Discord
#Meta Ray-Ban
Smith-Kettlewell Eye Research Institute
EyeSightQuest LLC
University of Nevada, Reno
Rokkhi.com
Bangladesh University of Engineering and Technology
Elab.ai
My previous research involved diagnosing ocular diseases such as Spaceflight Associated Neuro-ocular Syndrome (SANS) with novel VR-based tests. During long-duration spaceflight, astronauts undergo ocular structural and functional changes, and my research focused on creating a portable system to detect such changes reliably. Most of the following projects focus on visual assessments and simulations. My dissertation defense video is here:
Dissertation Video
To see some of my selected publications, click on the years below:
To see more of my publications, visit my Google Scholar Page:
Publications
Open source projects related to my publications:
Developed a novel multi-pass rendering pipeline with selective color correction on head-mounted displays for controlled vision research.
#Unreal Engine
#Post-Processing
#Color Science
#MATLAB
#Spectroradiometer
#Calibration
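The "selective" part of the pipeline above, correcting only flagged pixels while leaving the rest of the frame untouched, can be sketched in NumPy. This is a minimal illustration: the matrix values, mask, and function name are placeholders, not the calibrated correction used in the research.

```python
import numpy as np

def selective_correct(rgb: np.ndarray, mask: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Apply a 3x3 color-correction matrix only to pixels flagged by
    `mask`, leaving the rest of the frame untouched (the multi-pass idea)."""
    corrected = np.clip(rgb @ M.T, 0.0, 1.0)
    return np.where(mask[..., None], corrected, rgb)

# Placeholder matrix (a red-green remix); a real correction would come
# from a calibrated colorimetric model of the display and the observer.
M = np.array([[0.8, 0.2, 0.0],
              [0.2, 0.8, 0.0],
              [0.0, 0.0, 1.0]])

frame = np.full((4, 4, 3), 0.5)   # mid-gray frame
frame[0, 0] = [1.0, 0.0, 0.0]     # one pure-red pixel
mask = np.zeros((4, 4), dtype=bool)
mask[0, 0] = True                 # correct only that pixel
out = selective_correct(frame, mask, M)
```

Only the masked pixel is remixed; every unmasked pixel passes through unchanged.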
Developed novel psychophysical tests as well as clinical visual function tests for disease diagnostics and rehabilitation using video see-through head-mounted displays.
#Unreal Engine
#Psychopy
#SRanipal
#SRWorks
#PicoXR
Developed a novel graphics pipeline by modifying Unreal Engine 5.3 to shift the 2D or 3D VR view toward blur or sharpness in real time using a shader-based FFT. Also worked with procedural content generation, Lumen, and Nanite to enhance realism in virtual environments.
#Unreal Engine
#Procedural Content Generation
#Nanite
#Lumen
#MATLAB
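The frequency-domain blur idea above, attenuating high spatial frequencies through an FFT, can be sketched offline in NumPy. This is only an illustration of the math; the actual system runs a shader-based FFT inside Unreal Engine.

```python
import numpy as np

def fft_blur(image: np.ndarray, sigma: float) -> np.ndarray:
    """Low-pass filter an image in the frequency domain with a
    Gaussian transfer function (larger sigma -> stronger blur)."""
    h, w = image.shape
    fy = np.fft.fftfreq(h)[:, None]   # vertical spatial frequencies
    fx = np.fft.fftfreq(w)[None, :]   # horizontal spatial frequencies
    # Gaussian transfer function: 1 at DC, decays at high frequencies.
    transfer = np.exp(-2.0 * (np.pi * sigma) ** 2 * (fx ** 2 + fy ** 2))
    spectrum = np.fft.fft2(image)
    return np.real(np.fft.ifft2(spectrum * transfer))

rng = np.random.default_rng(0)
img = rng.random((64, 64))
blurred = fft_blur(img, sigma=2.0)
```

Because the transfer function is 1 at zero frequency, the mean luminance is preserved while high-frequency detail (and hence contrast) is attenuated; sharpening would instead boost the high-frequency band.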
Used MetaHuman to create animations from real-life sports data from NFL, NBA, and soccer games, so that referees can control a digital embodied agent, place themselves in in-game scenarios, and make correct calls using ophthalmic concepts.
#Unreal Engine
#Metahuman
#Pandas
#Numpy
Used Unreal Engine and ROS to create a 3D digital twin of a robot, observe manipulations in VR first, and then see the accepted interactions in video see-through AR.
#Unreal Engine
#ROSIntegration
Used morphological operations and Mask R-CNN-based region segmentation to identify individual land plots, and automatically assigned plot numbers using Bengali numeral OCR.
#ArcGIS
#Mask-RCNN
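The morphological cleanup and plot-numbering step can be sketched with SciPy. This is a toy illustration under stated assumptions: the real pipeline combines such masks with Mask R-CNN proposals and Bengali numeral OCR.

```python
import numpy as np
from scipy import ndimage

def label_plots(mask: np.ndarray, open_iters: int = 1):
    """Clean a binary land-plot mask with morphological opening
    (erosion then dilation removes speckle noise), then assign a
    sequential label to each connected plot."""
    cleaned = ndimage.binary_opening(mask, iterations=open_iters)
    labels, n_plots = ndimage.label(cleaned)  # labels 1..n_plots
    return labels, n_plots

# Toy mask: two rectangular "plots" plus a one-pixel speckle.
mask = np.zeros((10, 10), dtype=bool)
mask[1:4, 1:4] = True     # plot 1
mask[6:9, 6:9] = True     # plot 2
mask[0, 9] = True         # noise, removed by the opening
labels, n = label_plots(mask)
```

The opening erases the isolated speckle, so exactly two plots survive and each gets its own integer label.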
I have research collaborations with neuroscientists, vision scientists, ophthalmologists, sports medicine researchers, medical students, the NFL, and more. I am very passionate about learning new technologies and working intensely at hackathons.
Using tactile feedback from the Haptikos exoskeleton and the MX Ink stylus, visually impaired individuals manage to trace diagrams by connecting dots.
#Passthrough
#Haptikos
#Camera Access
#MXInk
DOGO is a visionOS app that takes the viewer on an immersive journey through rural Africa and the efforts by DOGO to improve it.
#VisionOS
#RealityKit
#Insta360
#OpenImmersive
HarmonyXR is a collaborative music composition tool for making music creation accessible to both deaf and hearing users. It allows users to modify music through gestures and interact with it through haptics in an immersive mixed reality environment.
#Unity
#Passthrough
#Haptics Studio
#Mixed Reality Utility Kit
#Audio SDK
Combined several live NASA APIs with county-level agricultural financial data to help county and state policymakers make informed decisions about food security.
#HTML
#CSS
#JavaScript
#PHP
BUGSKILL is an immersive AR game where players use their hands to eliminate flies in a dynamically mapped real-world environment. Players can slap flies on surfaces, clap them mid-air, or use a frog hand puppet to snap them up, all enhanced by spatial audio cues and Meta's advanced hand tracking for realistic interactions.
#Passthrough
#Scene API
#Depth API
#Mixed Reality Utility Kit
#Audio SDK
An iPadOS tabletop game viewer for real game data. You can walk around the field and enjoy the game from any angle you like.
#SwiftUI
#RealityKit
Used complex-network concepts such as degree centrality to understand disease knowledge dynamics in a disease graph, and used that to assign ICD codes to EHR entries from the MIMIC-III dataset.
#Networkx
#MIMIC-III
#Pandas
#Gephi
#NumPy
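Degree centrality on a small co-occurrence graph can be sketched with NetworkX. The nodes and edges below are hypothetical example data; the real graph is built from MIMIC-III entries.

```python
import networkx as nx

# Toy disease knowledge graph: nodes are diagnosis concepts,
# edges are co-occurrence links (hypothetical example data).
G = nx.Graph()
G.add_edges_from([
    ("sepsis", "pneumonia"),
    ("sepsis", "aki"),        # acute kidney injury
    ("sepsis", "shock"),
    ("pneumonia", "copd"),
])

# degree_centrality(v) = degree(v) / (n - 1), so well-connected
# concepts score high and can weight candidate ICD codes.
centrality = nx.degree_centrality(G)
hub = max(centrality, key=centrality.get)
```

Here "sepsis" touches three of the four other concepts, so it dominates the ranking; in the actual work such scores inform which ICD codes to assign to an EHR entry.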
Used a graph neural network along with KNN to segment mammogram images; the first open-access implementation of a bilateral mammography model.
#Detectron2
#PyTorch Geometric
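The k-NN graph-construction step that feeds such a GNN can be sketched in NumPy. This is illustrative only; the published model builds its graphs over mammogram features with PyTorch Geometric.

```python
import numpy as np

def knn_edges(features: np.ndarray, k: int) -> np.ndarray:
    """Build a directed (source, target) edge list connecting each
    node to its k nearest neighbors in feature space -- the graph a
    GNN segmenter would then pass messages over."""
    # Pairwise squared Euclidean distances, with self-distances masked.
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    nbrs = np.argsort(d2, axis=1)[:, :k]        # k nearest per node
    src = np.repeat(np.arange(len(features)), k)
    return np.stack([src, nbrs.ravel()])        # shape (2, N * k)

# Two tight clusters of 2D points; each point's nearest neighbor
# is its cluster-mate.
pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
edges = knn_edges(pts, k=1)
```

The `(2, N*k)` edge-index layout matches the convention PyTorch Geometric expects, which is why the construction is usually done this way.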
Used Unreal Engine post-processing materials to simulate cataract and metamorphopsia in a Martian environment to judge astronaut performance with impaired vision.
#Unreal Engine
#Postprocessing Material
Used Unreal Engine, the MetaXR plugin, and a Quest 2 headset to measure the threshold of aniseikonia with dichoptic stimulation.
#Unreal Engine
#Postprocessing Material
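Threshold measurement of this kind is typically driven by an adaptive staircase. A generic 2-down/1-up staircase can be sketched as follows; this is an assumption for illustration, since the exact psychophysical procedure is not specified here.

```python
def staircase(respond, start=10.0, step=1.0, reversals_needed=6):
    """Generic 2-down/1-up staircase: two correct responses make the
    stimulus harder (smaller), one error makes it easier (larger).
    Converges near the ~70.7% correct point; the threshold estimate
    is the mean stimulus level at the reversals."""
    level, streak, direction = start, 0, 0
    reversal_levels = []
    while len(reversal_levels) < reversals_needed:
        if respond(level):                 # detected the size difference
            streak += 1
            if streak == 2:                # two in a row -> step down
                streak = 0
                if direction == +1:        # was going up: a reversal
                    reversal_levels.append(level)
                direction = -1
                level = max(level - step, 0.0)
        else:                              # missed -> step up
            streak = 0
            if direction == -1:            # was going down: a reversal
                reversal_levels.append(level)
            direction = +1
            level += step
    return sum(reversal_levels) / len(reversal_levels)

# Simulated deterministic observer who detects any size difference >= 4%.
threshold = staircase(lambda lvl: lvl >= 4.0)
```

With this deterministic observer the staircase oscillates between 3 and 4, so the reversal mean lands at 3.5, bracketing the true 4% threshold from below as a 2-down/1-up rule does.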
NSF Innovation Corps
2020
Participated in the NSF Innovation Corps National Program, receiving $50,000. Conducted customer discovery for EyeSightQuest, focusing on AR-based assistive technology for visually impaired individuals.
XR BootCamp
2024
Won a $2,000 scholarship to XR BootCamp and completed 8 weeks of intensive work with high honors.
University of Nevada, Reno
Fall 2023
Taught graduate students source version control and data visualization techniques using git, GitHub, matplotlib, and seaborn across different operating systems.
University of Nevada, Reno
Spring 2022 & Spring 2023
Supervised deep learning project developments, teaching graduate students to use numpy, cv2, sklearn, torch, and Nvidia's GPU-specific parallel computing cluster infrastructure.
University of Nevada, Reno
Spring 2024
Held lab sessions to teach Raylib C++ game engine architecture.
University of Nevada, Reno
Fall 2021, Fall 2022, Fall 2023
Taught game development and source version control. In Fall 2021, focused on Unreal Engine C++ development. From Fall 2022 onwards, taught Unity and C#.
Academic Services
Multiple Years
Served as a reviewer for prestigious conferences and journals including IEEE VR, Scientific Reports, ACM HCI, ACM HRI, Cell Press Innovation, and Brain Imaging and Behavior.
US Army Educational Outreach Program
Summer 2022
Mentored students as part of the US Army Educational Outreach Program.
International Symposium on Visual Computing
ISVC'20
Organized the International Symposium on Visual Computing (ISVC'20).
Bangladesh University of Engineering and Technology
February 2014 - September 2018
Awarded for academic excellence, ranking 56th among the top 100 in undergraduate admission.