The ACTIVE project focuses on the major research field of automatic Human Activity Recognition (HAR), a field that underpins Ambient Intelligence (AmI) and Ambient Assisted Living (AAL). In domestic AAL-oriented applications, HAR is typically performed by processing data from sensing modalities built upon: (a) wearable sensors, (b) static IoT sensors and (c) dynamic sensors, such as the sensors of mobile robot units. The ACTIVE project aims to advance the state of the art in multimodal HAR by (a) investigating a novel, adaptive and context-dependent fusion method for data derived from wearables and the static IoT sensors of a smart home and (b) introducing a novel approach for fusing wearables and static IoT sensors with a mobile-robot-based dynamic sensing modality within the smart home.

More specifically, our research starts by investigating a new hierarchical, multi-level approach for the dynamic fusion of multiple sensing modalities, specifically wearables and the static IoT sensors of the smart home environment. On top of this, we are developing a novel goal-oriented, ensemble coordination framework, capable of steering the efforts of the integrated system so as to enhance both (a) the effectiveness of activity recognition and (b) the level of detail and robustness of behavior understanding.

The proposed methods for human activity recognition and behavior understanding through the fusion of (a) wearables and IoT data and (b) wearables, IoT and mobile robot vision will be systematically evaluated in the course of the project. By the end of the project, we will establish a proof-of-concept integrated wearables/IoT/robot system, capable of operating with our proposed methods in a real smart home. This system will be used for the final validation of the proposed approach, which will take place in the CERTH-ITI smart home, also utilizing the RAMCIP robot.
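To make the idea of context-dependent multimodal fusion concrete, the sketch below shows one common baseline pattern: decision-level (late) fusion, where each modality produces a class-probability vector and the vectors are combined with context-dependent reliability weights. This is a minimal illustrative example only; the function names, weighting scheme and activity labels are our own assumptions and do not represent the project's actual fusion method, which is adaptive and hierarchical.

```python
# Illustrative sketch: decision-level (late) fusion of per-modality activity
# classifiers with context-dependent weights. All names, labels and the
# weighting scheme are hypothetical, not the ACTIVE project's actual method.

import numpy as np

ACTIVITIES = ["walking", "cooking", "sleeping"]  # hypothetical label set

def fuse(modality_probs, context_weights):
    """Weighted average of per-modality class-probability vectors.

    modality_probs  : dict mapping modality name -> probability vector
    context_weights : dict mapping modality name -> reliability weight,
                      e.g. a context model could lower the weight of a
                      modality known to be unreliable in the current situation
    """
    total = sum(context_weights[m] for m in modality_probs)
    fused = sum(context_weights[m] * np.asarray(p, dtype=float)
                for m, p in modality_probs.items()) / total
    return ACTIVITIES[int(np.argmax(fused))], fused

# Example: a wearable IMU and static IoT sensors disagree; in this context
# the wearable is judged more reliable and dominates the fused decision.
probs = {
    "wearable": [0.7, 0.2, 0.1],   # IMU classifier: probably walking
    "iot":      [0.3, 0.6, 0.1],   # ambient sensors: possibly cooking
}
weights = {"wearable": 0.8, "iot": 0.4}

activity, fused = fuse(probs, weights)
print(activity)  # the activity with the highest fused probability
```

In practice, a hierarchical scheme would also fuse features at lower levels and adapt the weights online, but the basic weighted-ensemble decision step remains a useful mental model.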