
EasyWalk Project

EasyWalk: Intelligent Social Walker for active living

Project pipeline (figure)

In the last decade, several studies have proposed solutions to prototype a smart walker. Approaches range from user-guided walkers with an intelligent braking system to autonomous platforms equipped with people-detection modules and teleoperation capabilities. However, despite these technological and scientific efforts, the translational impact of current smart walkers remains limited. We identify three main reasons for this: i) current approaches lack reliability and robustness, especially in daily-life operation and in natural scenarios; ii) the coupling between human and machine is underestimated: either the user is guided or s/he drives the walker autonomously, and in most cases the integration of the intentions of the two actors (user and machine) is minimal; iii) the cost of the platform is often prohibitive, which jeopardizes the widespread adoption of the technology and, above all, prevents continuous personal use at home. As a direct consequence, users, and in particular older adults, do not trust the walker and thus do not use it.

In the EasyWalk project, we hypothesize that this trust can be achieved by providing a walker that is low-cost, easy to use, reliable, and able to infer, integrate, and contextualize the user's intentions according to information about the environment.

The project will address these challenges by proposing innovative solutions based on human-robot interaction approaches. In detail, EasyWalk will pursue the following specific objectives:

  • building a smart walker equipped with a minimal set of inexpensive sensors (i.e., a commercial RGB-D camera, a low-cost laser range-finder, and force sensors);

  • exploiting the information from these sensors to provide safe and social navigation in unknown and cluttered environments (i.e., without a predefined global map);

  • inferring the user's motion intention (i.e., starting and stopping walking, and changing direction) from the movements of the legs and the forces applied to the walker's handlebar;

  • integrating the user's and the robot's intelligence in a novel and trustworthy shared-intelligence approach based on the fusion of techniques such as potential fields and model predictive control (a minimal sketch of such a blend follows this list).
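
The shared-intelligence objective combines the user's intent, inferred from the handlebar forces, with environment-driven corrections such as potential-field repulsion from detected obstacles. Below is a minimal Python sketch of such a blend under simplifying assumptions: the handlebar force is a 2D reading in the walker frame, obstacles are point coordinates from the range-finder, and all function names and gains are illustrative, not the project's actual controller (which also involves model predictive control, not shown here).

    import numpy as np

    def user_intent_velocity(handlebar_force, gain=0.5, max_speed=0.8):
        # Map the planar force applied to the handlebar (N) to a desired velocity (m/s),
        # saturated at a comfortable walking speed.
        v = gain * np.asarray(handlebar_force, dtype=float)
        speed = np.linalg.norm(v)
        return v if speed <= max_speed else v * (max_speed / speed)

    def repulsive_velocity(obstacles, influence_radius=1.5, k_rep=0.3):
        # Classic potential-field repulsion: each obstacle point (walker frame, metres)
        # pushes the command away, with a stronger effect at shorter distances.
        v_rep = np.zeros(2)
        for p in np.atleast_2d(np.asarray(obstacles, dtype=float)):
            d = np.linalg.norm(p)
            if 1e-3 < d < influence_radius:
                v_rep += k_rep * (1.0 / d - 1.0 / influence_radius) * (-p / d)
        return v_rep

    def shared_control(handlebar_force, obstacles, alpha=0.7):
        # Blend user intent and obstacle avoidance; alpha weights the user's authority.
        return alpha * user_intent_velocity(handlebar_force) + (1.0 - alpha) * repulsive_velocity(obstacles)

    # Example: the user pushes forward while an obstacle lies ahead and slightly to the right.
    cmd = shared_control(handlebar_force=[2.0, 0.0], obstacles=[[0.8, -0.3]])
    print(cmd)  # planar velocity command to be tracked by the walker's wheel controller

Tuning the blending weight shifts authority between user and walker; the shared-intelligence approach targets exactly this kind of trade-off.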

We foresee that fulfilling these goals will lead to an innovative smart walker able to interpret the user's intentions and, at the same time, suggest and support safe navigation. Furthermore, we expect that a stronger coupling between user and walker will improve ease of use and, consequently, the user's trust in the robotic device.

Repositories

  • OpenNav (Public): official code for "OpenNav: Efficient Open Vocabulary 3D Object Detection for Smart Wheelchair Navigation" (ACVR Workshop at ECCV'24). Python, MIT license; last updated Oct 30, 2024.

  • .github (Public): last updated Aug 20, 2024.
