LiDAR technology can accurately generate a 3D representation of the environment, which creates numerous opportunities for improving assistive technology for blind and visually impaired individuals. Unfortunately, LiDAR is expensive and available only in high-end mobile devices, such as the iPhone 12 Pro and 13 Pro, making LiDAR-based assistive technology inaccessible to the majority of users. Software LiDAR aims to imitate the capabilities of LiDAR by integrating the output of MiDaS, a neural network for depth estimation, with Apple’s ARKit, bridging this gap in technology and making assistive technology more accessible. Software LiDAR was implemented and benchmarked against hardware LiDAR, and was found to be fairly accurate, especially after removing failure cases. Moreover, co-designers expressed a desire to use the MiDaS depth data for object detection, showing its potential in assistive technology and motivating the future development of software LiDAR.