05 Jan 2026
6 Min Read
Dr Afizan Azman (Academic Contributor), Nellie Chan (Editor)
Communication is something most of us take for granted. For members of the Deaf and Hard of Hearing (DHH) community, even everyday activities—such as visiting a hospital, accessing public services, or participating in education—can be fraught with barriers. Many rely on sign language interpreters, but when they are not available, communication can break down entirely.
At Taylor’s University, Dr Afizan Azman is leading the development of TALK BIM, a smart mobile application designed to improve access for the DHH community. By combining computer vision with natural language processing, his team has built a tool that translates sign language into real-time text, enabling direct communication between Deaf and hearing individuals and fostering greater inclusion and independence in daily life.
Q: What inspired the development of TALK BIM?
A: The project began with a simple observation: the DHH community often encounters communication barriers, particularly in essential settings such as schools, hospitals, and government offices. Through our collaboration with Persekutuan Orang Pekak Malaysia, we gained firsthand insights into the everyday challenges they face. This understanding motivated our team to explore how technology could provide an inclusive solution to bridge the communication gap.
Q: How does TALK BIM translate signs into text?
A: TALK BIM might seem simple at first glance: a user signs in front of a phone camera, and the corresponding text appears instantly. Behind the scenes, the app uses a combination of AI technologies. Computer vision captures hand shapes and movements (with future updates planned to include facial expressions, mouth patterns, and body posture); machine learning recognises patterns and interprets meaning; and gesture and sequence analysis ensures that both individual signs and complete sign sequences are accurately recognised. Trained specifically on BIM, TALK BIM then translates these recognised signs into text.
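As a rough illustration of the pipeline Dr Afizan describes, the sketch below pairs a vision front end (which would emit per-frame hand landmarks) with a simple sequence matcher that maps a landmark trajectory to a known sign. It is a minimal, hypothetical sketch; the names (`SIGN_TEMPLATES`, `classify_sequence`) and the toy nearest-trajectory classifier are assumptions for illustration, not TALK BIM's actual models.

```python
from math import dist

# Toy "templates": each sign is a short sequence of 2-D landmark frames.
# A real system would use many 3-D landmarks per hand, produced per frame
# by a hand-tracking model, and a learned sequence model rather than
# nearest-neighbour matching.
SIGN_TEMPLATES = {
    "hello": [(0.2, 0.8), (0.4, 0.6), (0.6, 0.4)],   # hand rising across frames
    "thanks": [(0.5, 0.5), (0.5, 0.7), (0.5, 0.9)],  # hand lowering across frames
}

def sequence_distance(a, b):
    """Average point-wise distance between two equal-length frame sequences."""
    return sum(dist(p, q) for p, q in zip(a, b)) / len(a)

def classify_sequence(frames):
    """Return the template sign whose trajectory best matches the input."""
    return min(SIGN_TEMPLATES,
               key=lambda s: sequence_distance(frames, SIGN_TEMPLATES[s]))

# A noisy observation of the "hello" trajectory, as a camera might produce.
observed = [(0.25, 0.78), (0.42, 0.61), (0.58, 0.42)]
print(classify_sequence(observed))
```

The point of the sketch is the division of labour: computer vision reduces video to landmark sequences, and sequence analysis, not single-frame matching, decides which sign was produced.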
Q: What major milestone did the project achieve as it developed?
A: A major milestone was achieving real-time recognition and translation specifically for BIM. While other sign languages, such as ASL, have seen similar capabilities, this marked the first successful implementation for BIM—enabling Deaf and hearing individuals in Malaysia to communicate instantly and seamlessly.
Q: What were the biggest challenges in developing TALK BIM?
A: One major challenge was the lack of annotated datasets for BIM. Unlike ASL, there were no large, publicly available datasets suitable for training. To overcome this, we collaborated closely with the DHH community to capture and annotate our own dataset. We also optimised our models to deliver accurate, real-time translation despite limited training resources.
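The annotation effort described above might, purely as an illustration, produce records along these lines; every field name and value here is hypothetical, since the team's actual schema is not public.

```python
import json

# Hypothetical annotated sample for a BIM sign-clip dataset.
# Field names are illustrative, not TALK BIM's actual schema.
sample = {
    "clip_id": "bim_0001",
    "gloss": "TERIMA KASIH",           # BIM gloss ("thank you")
    "signer_id": "S07",                # anonymised signer, for clean train/test splits
    "region": "Selangor",              # regional variation matters for BIM
    "frames": 48,                      # clip length in frames
    "landmarks_file": "bim_0001.npz",  # per-frame hand landmarks extracted from video
}

# Serialising one record per line (JSON Lines) keeps a growing dataset streamable.
print(json.dumps(sample, ensure_ascii=False))
```

Tracking signer identity and region per clip is what makes it possible to hold out signers during evaluation and to check that the model generalises across regional signing variation.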
Q: Are there common misconceptions in this area of study?
A: Many people assume sign languages are universal—but BIM is completely distinct from ASL, requiring its own models and datasets. Another misconception is that real-time translation can be achieved simply by applying existing AI tools. In reality, it demands extensive dataset collection, careful annotation, and model optimisation tailored to the specific language and community.
Q: How did the DHH community shape TALK BIM beyond datasets?
A: The DHH community shaped TALK BIM far beyond providing data—they directly influenced its features and overall user experience. In a workshop with around 30 Deaf participants, we tested an early version of the app in real-life conversations. Their feedback on translation speed, camera tracking, and interface design helped us improve the app: faster translations, better recognition of different signing styles and regional variations, and a more intuitive, user-friendly interface. They also inspired ideas for new features, such as a learning mode for hearing users and a voice/text-to-BIM avatar, ensuring the app truly meets the real communication needs of the community.
Q: Why is this project particularly relevant now?
A: TALK BIM is particularly relevant now because advances in machine learning and the development of a robust BIM dataset have made real-time translation possible, while societal and governmental focus on accessibility and inclusivity continues to grow. This alignment of technology, data, and priorities provides the perfect opportunity for the app to support linguistic research, expand adoption across sectors, and contribute to SDG 4 (Quality Education), SDG 8 (Decent Work and Economic Growth), SDG 10 (Reduced Inequalities), and several other related goals.
Q: Who stands to benefit most from TALK BIM?
A: The primary beneficiaries are members of the DHH community, who now have a tool that enables real-time communication. Healthcare facilities, government offices, educational institutions, and customer-facing organisations also stand to benefit, as the app helps them engage more effectively with Deaf individuals without the need for a sign language interpreter.
Q: How could TALK BIM be implemented across different sectors?
A: Implementing TALK BIM across healthcare, education, government services, and retail or service industries begins with continuous system training. The app, which already recognises a core set of BIM signs, needs to expand its vocabulary, adapt to regional variations, and interpret natural, continuous signing to perform reliably in real-world settings. Deployment can then proceed through pilot sites, such as selected hospitals or schools, to test workflows, gather feedback, and refine operations. Integration with existing devices ensures staff can use the app seamlessly without extensive training. Finally, collaboration with national agencies and the DHH community builds trust, ensures cultural accuracy, and supports sustainable adoption—together, these steps form a clear roadmap for scalable implementation.
Q: What keeps you motivated in this work?
A: What keeps me motivated is seeing how technology, developed alongside the people it serves, can make a real difference. Working closely with the DHH community has shown me the impact of collaboration and the importance of providing equal opportunities. Every time I see someone communicating more freely and participating fully, it reminds me why this work matters and drives me to continue contributing to projects like TALK BIM.
Q: How has your academic and teaching experience influenced this project?
A: My expertise in AI, computer vision, and natural language processing guided the technical direction of TALK BIM, helping me determine the models and algorithms needed for real-time sign language translation. Equally important, my experience teaching data analytics and machine learning reinforced structured problem-solving and project design skills, which were crucial for addressing challenges such as dataset creation and model optimisation.
Driven by Dr Afizan’s vision, TALK BIM is opening new possibilities for the DHH community in Malaysia, empowering them to communicate, access opportunities, and participate fully in society. More than a technological breakthrough, the project demonstrates how innovation can create tangible, everyday impact when developed with its users in mind.
The next phase focuses on close collaboration with the community to refine the application for wider deployment, alongside new features such as a learning mode for hearing users and a voice/text-to-BIM avatar enabling two-way communication.
In a world where technology often outpaces people, Dr Afizan’s work reminds us that progress is most meaningful when it connects individuals, not just machines.