One of the serious concerns for any parent in the digital ecosystem is how to keep their children safe from the ‘digital ghost’, and edtech could be an easy trap.
Child safety is of paramount concern. Digital exposure brings new challenges, and with them new types of potential harm caused by the ‘digital ghosts’ or ‘digital bhoots’ that exist in the ecosystem.
There is no denying that the digital exposure of children has increased. It has gone well beyond regular schooling, and parents are encouraging children to opt for the many courses and certifications offered digitally. This has resulted in a thriving edtech space, and over the past six months or so we have even seen edtech companies acquiring brick-and-mortar education brands.
Various flavours of learning are being offered in the edtech space with the objective of differentiating platforms and attracting as many students as possible. A lot of content is being recreated to suit this mode and to facilitate easy self-paced and virtually assisted learning.
At the same time, these edtech companies understand that they are dealing with children of a tender age, whose Internet exposure cannot be treated like an adult's. For this, they adhere to ‘kidsafe’ practices and ensure that they use content which does not harm children in any way – especially psychologically.
Edtech companies are cognizant of making the experience safe and child friendly. However, that holds only within the boundaries of their application, where they have full control. Outside it, a great deal can be counterproductive for a much larger audience – users who may never cross the line into the app to experience all the measures taken to make every element inside it child safe and friendly.
Except for children under 10 years of age, parents do allow children to explore such opportunities on their own. In their teens, children also get a chance to recommend to parents which courses they want to pursue. In the latter case, children are exposed during the discovery of such applications, which means they interface with the promotional and advertising environment of edtech apps. For children under 10, it is primarily the parents who explore such solutions.
In both situations, it is essential for the decision makers to get assurance that the application they choose isn't harming children in any way – especially through content that could contribute to the development of negative traits, since children lack the capability to handle such exposure at a tender age. For instance, hate speech, obscenity, crime, and other such content.
In the broadcasting world, we have clear demarcations of what is suitable for children and what is not. That is why even action content isn't advised for children, and certain content is explicitly categorised as adult only.
Due to brand safety issues, ads often get placed wrongly – without any intervention by the edtech solution provider – and could carry content not suitable for children. This means that children, as well as parents exploring such solutions, could land in ‘bad areas’ of the Internet.
While children could land in entirely the wrong territory, one unsuitable for them, parents would be shocked to see the affiliations of the platform they are exploring for their children. The issue can get worse because ads are served based on the content consumption pattern and interests of the user the device is profiled against. In these cases, it would be a parent – an adult – whose profile would be targeted by advertisers through ad networks, affiliates and other mediums.
A responsible and aware edtech platform has to look at things end-to-end and make the entire experience child safe and friendly, not just inside the app once someone onboards. Otherwise, the majority of potential users could end up with a bad experience and a poor impression of the platform, while only those who convert and sign up get to appreciate the proactive measures the app takes to give children their due environment.
This ‘digital ghost’ or ‘digital bhoot’ is something that we need to keep children away from. Otherwise, the psychological damage a ghost invisibly inflicts on minds in the real world – damage which at times lasts a lifetime – could be replicated in the virtual world on children at a very tender age. Unfortunately, edtech is a platform with a high risk of carrying this invisible ghost, harming children even while attempting to do better for their overall development.
Brand safety is crucial for any brand, especially those dealing directly with potentially vulnerable sections like children. mFilterIt is already engaged in this space with a few proactive edtech platforms that are piloting activities with us in this direction. However, this should become an industry-wide hygiene practice, where the objective is to make the entire experience child safe, not just inside the app, which is a controlled world for the platform. To know more about mFilterIt's edtech child safety initiatives, get in touch with us today.