Language research has made huge strides in recent years, but most of this work has focused on high-resource languages. As new technologies emerge, the digital divide between high-resource and low-resource languages deepens, limiting the extent to which speakers of under-served languages can access information and communicate with these technologies.

On 17th October, we will hear more on this topic from InstaDeep Research Engineer Orevaoghene Ahia, who will give the talk “Neural Machine Translation for Low Resource Languages” at GDG Lagos DevFest 2020. Orevaoghene will discuss the importance of building Neural Machine Translation (NMT) models for low-resource languages, emphasising the role research communities can play in tackling this problem and drawing on examples from the work done to date by her fellow researchers at InstaDeep and across Africa.

“Despite the exciting happenings around NLP research, there has been limited focus on low-resourced languages, particularly African languages. If no significant effort is channelled towards these languages, they will be at a loss in the future, given the proliferation of AI in our daily lives. I am therefore excited to present this talk because I hope that it will shed some light on the importance of working on low-resourced languages”, says Orevaoghene. 

As part of the presentation, Orevaoghene will also showcase the demo “Building a Yoruba Translation Model with TensorFlow”.

The event, hosted by Google Developer Groups (GDG), brings together thousands of developers for one day only. We are excited to hear the talks and hope you will join us! Register for the free AI event here.

If you are interested in other language research initiatives, check out our contribution to the Masakhane paper here, or read how the InstaDeep Lagos team built a world-first NLP translation model.