If you’re in North Korea and want to use some North Korean-branded hardware known as the Ullim tablet, be aware that it imposes a level of control to which we’re unaccustomed in the West. Martyn Williams on 38 North notes that the changes, beyond a hardware modification, fall into four categories: Constant Surveillance, Approved Apps Only, File Watermarking, and Restricted Media Compatibility. Martyn’s thoughts:
Taken together, the various systems and software on Ullim represent a significant barrier to activists who are hoping the greater spread of portable electronics will increase the ability of North Koreans to freely access information.
“If you do manage to get an app on there and try to install it, it won’t work because the signature is wrong,” said Grunow. “The [Android file] must be signed with the government key. Additionally, there is a check to see if the app is in the whitelist and a normal user cannot get into the code to add to the whitelist.”
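The two gates Grunow describes – a signature that must verify against a government key, and a whitelist a normal user can’t edit – can be sketched in a few lines. This is purely illustrative: the key, whitelist, package names, and HMAC-based "signing" below are stand-ins of my own invention, not the tablet’s actual mechanism (real Android APK signing uses public-key certificates, not HMAC).

```python
import hmac
import hashlib

# Stand-ins for the government signing key and the approved-app whitelist
# (both hypothetical; on the real device the whitelist is inaccessible to users).
GOVERNMENT_KEY = b"dprk-signing-key"
WHITELIST = {"kp.approved.reader"}

def sign(apk_bytes: bytes, key: bytes) -> str:
    """Produce a signature the way the signing authority would (HMAC stand-in)."""
    return hmac.new(key, apk_bytes, hashlib.sha256).hexdigest()

def may_install(package_name: str, apk_bytes: bytes, signature: str) -> bool:
    # Gate 1: the file's signature must verify against the government key.
    if not hmac.compare_digest(sign(apk_bytes, GOVERNMENT_KEY), signature):
        return False
    # Gate 2: even a correctly signed app must also appear on the whitelist.
    return package_name in WHITELIST

apk = b"...app bytes..."
ok_sig = sign(apk, GOVERNMENT_KEY)
may_install("kp.approved.reader", apk, ok_sig)                       # True: signed and whitelisted
may_install("org.sideloaded.app", apk, ok_sig)                       # False: not on the whitelist
may_install("kp.approved.reader", apk, sign(apk, b"attacker-key"))   # False: wrong signing key
```

The point of the second gate is exactly what the quote says: even if an activist could somehow obtain a validly signed file, it still fails unless its name is on a list the user cannot modify.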
“This basically finishes all of your efforts to be a normal user in the DPRK,” he said. “It’s virtual[ly] impossible.”
Unfiltered information is one of the biggest enemies of the North Korean regime so it’s no surprise that engineers have gone to such lengths to lock down the tablet.
This brings a couple of thoughts to mind, the first triggering the second.
First, how much usage data does North Korea actually face? If tablets and other mobile computing devices are perceived as a positive by the North Korean dictators, then equipping many citizens, military or not, would be expected – and the amount of information these modifications generate for analysis might fall into the same league as that of the British surveillance cameras: initially feared, but perhaps impotent because of the sheer volume of data to analyze.
Which leads to the second question: can they cut the data volume down with simple analytical techniques, or will we see the dark side of Artificial Intelligence developed and deployed to retain control over the North Korean populace? I doubt they would develop a truly independent, sentient AI, so it would remain a tool, and thus devoid of moral attributes – and moral choices. So we’re deprived of wondering whether an AI developed in an authoritarian country would develop a morality favoring command and control, or freedom and (to use an unexpected noun) chaos.