Abstract. Wearable inertial sensors have become an inexpensive option for measuring a person's movements and posture. Techniques based on environmental sensors, such as ultrasound trackers or vision-based methods, require an unobstructed line of sight or a local installation, and their data is difficult to access from a wearable computer's perspective. A body-centric approach, in which sensor data is acquired and processed locally, instead requires appropriate algorithms that can operate under restricted resources. The objective of this paper is to give an overview of algorithms that abstract inertial data from body-worn sensors, illustrated with data from state-of-the-art wearable multi-accelerometer prototypes.