Cyber-Physical Systems (CPS) rely on advances in fields such as robotics, mobile computing, sensor networks, control, and communications to enable complex real-world applications including aerospace, transportation, factory automation, and intelligent systems. The multidisciplinary nature of effective CPS research diverts specialized researchers' efforts toward building expensive and complex testbeds for realistic experimentation, delaying or distracting from their core potential contributions. We present Up and Away (UnA), an inexpensive, generic testbed composed of multiple autonomously controlled Unmanned Aerial Vehicle (UAV) quadcopters. We chose UAVs because their deployment flexibility and maneuverability enable a wide range of CPS research evaluations in areas such as 3D localization, camera sensor networks, target surveillance, and traffic monitoring. Furthermore, we provide a vision-based localization solution that uses color tags to identify objects in environments of varying light intensity, and we use that system to control the UAVs within a specific area of interest. UnA's architecture is modular, however, so the localization system can be replaced by any other (e.g., GPS) as deployment conditions change. UnA interacts with real-world objects, treating them as CPS input, and uses the UAVs to carry out CPS-specific tasks while providing sensory information from each UAV's array of sensors as output. We also provide an API that allows the integration of simulation code that obtains input from the physical world (e.g., targets to track) and then supplies control parameters (i.e., the number of quadcopters and their destination coordinates) to the UAVs. The UnA architecture is depicted in Figure 1a. To demonstrate the promise of UnA, we use it to evaluate another research contribution we make in the area of smart surveillance, namely target coverage using mobile cameras.
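The input/output contract described above could be sketched as follows. This is a minimal illustration of how simulation code might consume physical-world targets and emit control parameters; the class and function names are our own assumptions, not the actual UnA API.

```python
# Hypothetical sketch of a simulation hook for a UnA-style API.
# All names (ControlParams, plan) are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ControlParams:
    num_quadcopters: int                             # how many UAVs to dispatch
    destinations: List[Tuple[float, float, float]]   # (x, y, z) per UAV

def plan(targets: List[Tuple[float, float, float]]) -> ControlParams:
    """Toy planner: send one UAV to hover above each detected target."""
    hover_alt = 2.0  # meters; arbitrary illustrative altitude
    dests = [(x, y, hover_alt) for (x, y, _z) in targets]
    return ControlParams(num_quadcopters=len(dests), destinations=dests)

# Physical-world input, e.g. from the vision-based localization system:
targets = [(1.0, 2.0, 0.0), (4.0, 0.5, 0.0)]
params = plan(targets)
```

In a real deployment the planner would be replaced by application-specific simulation code, and the returned parameters would be handed to the UAV controller.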
Optimal camera placement to maximize coverage has been shown to be NP-complete for both area and target coverage. Motivated by the need for practical, computationally efficient algorithms to autonomously control mobile visual sensors, we propose efficient near-optimal algorithms for finding the minimum number of cameras that cover a high ratio of targets. First, we develop a basic method, called cover-set coverage, to find the location and direction of a single camera for a group of targets. This method finds candidate points for each possible camera direction and spans the direction space by discretizing camera pans. We then propose two algorithms, (1) Smart Start K-Camera Clustering (SSKCAM) and (2) Fuzzy Coverage (FC), which divide the targets into multiple clusters and then use the cover-set coverage method to find the camera location and direction for each cluster. Finally, we integrated the implementation of these algorithms with the UnA testbed for real-time assessment, as shown in Figure 1b.
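The pan-discretization step can be illustrated with a small sketch: for one candidate camera position, sweep the pan angle in fixed increments and keep the direction covering the most targets. The field-of-view, range, and step values below are illustrative assumptions, not parameters from the paper.

```python
# Illustrative sketch of the discretized-pan idea behind cover-set coverage.
# fov_deg, rng, and step_deg are assumed values for demonstration only.
import math
from typing import List, Tuple

def best_pan(cam: Tuple[float, float],
             targets: List[Tuple[float, float]],
             fov_deg: float = 60.0,
             rng: float = 10.0,
             step_deg: float = 5.0) -> Tuple[float, int]:
    """Return (pan_angle_deg, covered_count) maximizing target coverage."""
    half = math.radians(fov_deg) / 2.0
    best_angle, best_count = 0.0, 0
    pan = 0.0
    while pan < 360.0:
        axis = math.radians(pan)
        count = 0
        for tx, ty in targets:
            dx, dy = tx - cam[0], ty - cam[1]
            if math.hypot(dx, dy) > rng:
                continue  # target outside sensing range
            bearing = math.atan2(dy, dx)
            # wrapped angular offset between target bearing and camera axis
            diff = math.atan2(math.sin(bearing - axis), math.cos(bearing - axis))
            if abs(diff) <= half:
                count += 1
        if count > best_count:
            best_angle, best_count = pan, count
        pan += step_deg
    return best_angle, best_count

angle, covered = best_pan((0.0, 0.0), [(1.0, 0.0), (2.0, 0.1), (-3.0, 0.0)])
```

The clustering algorithms would repeat this search over candidate positions for each cluster of targets, trading discretization granularity for runtime.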

