A neuro-inspired visual tracking method based on programmable system-on-chip platform

Yang, S. , Wong-Lin, K., Andrew, J., Mak, T. and McGinnity, T. M. (2017) A neuro-inspired visual tracking method based on programmable system-on-chip platform. Neural Computing and Applications, (doi:10.1007/s00521-017-2847-5) (Early Online Publication)

147256.pdf - Accepted Version (1MB)

Abstract

Using a programmable system-on-chip to implement computer vision functions poses many challenges due to tight constraints on cost, size and power consumption. In this work, we propose a new neuro-inspired image processing model and implement it on a Xilinx ZC702 system-on-chip board. By using an attractor neural network model to store the object's contour information, we eliminate the computationally expensive re-initialisation of the curve evolution at every iteration or frame. Our experimental results demonstrate that this integrated approach achieves accurate and robust object tracking, even when objects are partially or completely occluded in the scene. Importantly, the system is able to process 640 × 480 video streams in real time at 30 frames per second using only one low-power Xilinx Zynq-7000 system-on-chip platform. This proof-of-concept work demonstrates the advantage of incorporating neuro-inspired features when solving image processing problems under occlusion.

Item Type: Articles
Status: Early Online Publication
Refereed: Yes
Glasgow Author(s) Enlighten ID: Yang, Dr Shufan
Authors: Yang, S., Wong-Lin, K., Andrew, J., Mak, T., and McGinnity, T. M.
College/School: College of Science and Engineering > School of Engineering
Journal Name: Neural Computing and Applications
Publisher: Springer
ISSN: 0941-0643
ISSN (Online): 1433-3058
Published Online: 20 January 2017
Copyright Holders: Copyright © 2017 The Natural Computing Applications Forum
First Published: First published in Neural Computing and Applications 2017
Publisher Policy: Reproduced in accordance with the copyright policy of the publisher
