Concerns over the implications of artificial intelligence (AI) for the younger generation are mounting. As AI tools permeate our society, this Post report suggests that Nepali children as young as seven are already adept at using them.
Millennials and Gen Z grew up gradually adopting the tools of science, technology, and information and communication. Generation Alpha is now grappling with the emergence of AI tools, which will be an even bigger conundrum for Generation Beta (those born between 2025 and 2039).
Again, Nepal will not be an exception to this trend. Given AI’s increasing (and often seamless) integration into our daily lives, there are many ways we can benefit from it. Yet, if the use of AI tools gets out of hand, it will create multifarious problems for people and society.
Nepal is still in the early stages of using AI tools and technologies and, as such, has lagged in establishing AI regulation. Our laws do not specifically deal with AI issues. For instance, the Digital Nepal Framework 2019 envisions enhancing the country’s digital landscape through digital literacy and the advancement of information technology, but it lacks provisions to address AI’s growing role in society.
Similarly, the Electronic Transactions Act of 2008 falls short of tackling modern challenges such as AI-induced misinformation, child exploitation and cybercrime.
While these laws were enacted before many new-generation AI tools emerged, the time has come to update them in line with the changing landscape. Other institutions also need to be revamped. The Cyber Bureau of Nepal Police is working to safeguard society against cybercrimes, but it is woefully understaffed.
Likewise, the Ministry of Communication and Information Technology, the line agency for AI regulation, is working with limited manpower and resources.
The absence of even minimal AI regulations is concerning. As more and more children are exposed to the world of the internet, they are vulnerable to deepfakes (fake videos, audio and images), AI-generated child sexual abuse material (CSAM), cybercrimes and so on.
This is evident from the 2023 report by the Internet Watch Foundation, a UK-based organisation working to stop child sexual abuse online, which revealed that more than 20,000 AI-generated CSAM images had been posted on a single dark web forum in a one-month period.
Moreover, concerns are growing over how the use of AI in generating texts, solving mathematical problems and doing homework will affect children’s creativity and critical thinking.
Nepal can learn from other countries that have strived to deal with the challenges posed by AI. The United Arab Emirates, for instance, has appointed a Minister of State for Artificial Intelligence. The European Union has a comprehensive AI Act. Similarly, the UK has a specialised AI authority, the ‘Office for AI’.
In our own neighbourhood, India is working on the Digital India Act to regulate high-risk AI applications. These examples from abroad suggest that updating the country’s digital policies, investing in AI regulation without compromising people’s access to these tools, and incorporating AI education and awareness into school and college curricula could help. Equally vital will be equipping the relevant departments with human resources and modern tools.
AI tools are evolving rapidly, not by the year or even by the month; the evolution is now happening in hours. Nepali citizens, and young people in particular, need all the help they can get to navigate this complicated maze safely.