Korea’s Path to Digital Leadership

A Digital Policy Report Card for South Korea

Published February 28, 2024

SOURCE:    A Digital Policy Report Card for South Korea
AUTHORED BY:   Byoung-il Oh
ORGANIZATION:  Korean Progressive Network Jinbonet

For more than three decades, South Korea has made building its digital sector a high priority. The country is a world leader in semiconductor chip production and has one of the highest broadband penetration rates in the world. The online content available to Koreans is varied, and the country’s content industry is competitive and thriving. The International Telecommunication Union’s 2023 ICT Development Index ranked Korea eighteenth for digital development overall and twenty-second for connectivity.1

Significantly, these rankings were lower than those the country had scored in previous years.2 Despite being a leader in the global digital economy, Korea is not advancing as rapidly as it had in the past. This is due in part to the government’s inability to work effectively with industry leaders and the public to craft clear policies in several key areas. The bad news is that Korea has spent decades trying to sort out thorny issues like online authentication. The good news is that President Yoon Suk-yeol and his predecessor, Moon Jae-in, have been personally committed to improving how Korea addresses challenges like data protection and digital identity.

The performance of Korean digital policymakers over the last decade or so presents a mixed picture, and much more remains to be done. A consistent theme is that previous initiatives have often provided a clear vision, but implementation has been disappointing. This is due in part to a lack of broad buy-in from the industries and stakeholders involved.

PRIVACY, DATA PROTECTION, AND DATA INFRASTRUCTURE POLICIES: GRADE B

Korea’s personal information protection laws are very strong, but the implementation and enforcement of these laws have been inconsistent. It was only in 2020 that the Personal Information Protection Commission (PIPC) was established as a practical supervisory body and the relevant laws were unified. Still, further personal information protection legislation was needed to respond to the development of new technologies. The government therefore advanced laws to more clearly establish the rights of data subjects related to the automated processing of personal data. But because of the government’s desire to promote the use of personal information for the development of the big data industry, the rights of Korean data subjects are more limited than those provided by the EU’s General Data Protection Regulation (GDPR), and many aspects of personal information protection still need to be strengthened in Korea.

PERSONAL INFORMATION AND BIG DATA

An expression often used to emphasize the importance of the data industry is that data is “the new oil” of the twenty-first century. Personal information is one of the most important types of data. However, all Korea-based companies—including social media platforms, financial institutions, and small retailers—must demonstrate a legitimate basis under the Personal Information Protection Act (PIPA) to use personal information. In the private sector, personal information is usually processed with the consent of the data subjects. Of course, it is not easy to obtain the consent of every user of an online service that might be accessed by tens of millions of people. And what constitutes consent can vary. The processing of personal information for big data analysis is often done for reasons different from the purpose for which the information was collected in the first place. Therefore, as a way to use personal information without data subjects’ consent, Korea has promoted the concept of de-identification.

The De-identification of Personal Data

Since the early 2010s, government departments such as the Ministry of the Interior and Safety and the Korea Communications Commission (KCC) have created guidelines that allow de-identified personal data to be processed without the consent of data subjects, with the aim of revitalizing the big data industry.

In June 2016, the government of President Park Geun-hye released the Guidelines for De-identification of Personal Data, which integrated separate sets of guidelines that had previously been published by different ministries.3 The guidelines defined de-identification as “measures to make individuals unidentifiable by means of, for example, deleting or replacing all or some of the elements from [the] dataset.” Data that have undergone appropriate de-identification in accordance with the guidelines are presumed to be nonpersonal data and thus can be used for big data analysis or provided to third parties without the consent of data subjects.

At the same time, the guidelines prohibited the public disclosure of de-identified data in principle, because there can be a high risk of re-identification. De-identified data must be accompanied by security measures, as there is a possibility of re-identification if de-identified data are leaked and combined with other data. However, despite the fact that de-identified data can be re-identified, the guidelines presume that such data are nonpersonal and therefore exempt from the application of the PIPA. Finally, the guidelines require the government to designate so-called specialized agencies that focus on data de-identification to support the combination of data sets held by different data controllers.
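To make the guidelines’ definition concrete, the sketch below applies the two measures they name, deletion of some elements of a record and replacement of others with coarser values, to a toy customer record. The field names, the generalization rules, and the sample data are illustrative assumptions, not examples taken from the guidelines.

```python
# A minimal sketch of de-identification as the 2016 guidelines describe it:
# deleting some elements of a record and replacing others. Field names and
# rules are hypothetical.

def deidentify(record: dict) -> dict:
    out = dict(record)
    # Deletion (suppression): drop direct identifiers outright.
    for field in ("name", "rrn", "phone"):
        out.pop(field, None)
    # Replacement (generalization): keep quasi-identifiers only in coarse form.
    out["age"] = f"{record['age'] // 10 * 10}s"    # 37 -> "30s"
    out["address"] = record["address"].split()[0]  # keep only the city/province
    return out

customer = {"name": "Hong Gil-dong", "rrn": "800101-1234567",
            "phone": "010-1234-5678", "age": 37,
            "address": "Seoul Mapo-gu", "purchases": 12}

print(deidentify(customer))
# {'age': '30s', 'address': 'Seoul', 'purchases': 12}
```

Even a record reduced this far can sometimes be re-identified by linking the remaining quasi-identifiers with other data sets, which is why the guidelines pair de-identification with disclosure bans and security measures.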

Civil society groups criticized the guidelines for violating the PIPA. The concept of de-identification had not been considered when the act was originally passed, and it was unclear whether de-identified data were personal information or not. Moreover, even if a data subject was re-identified from de-identified data, the data controller was exempt from liability as long as the data were de-identified again.

In particular, when combining data sets, it is difficult to view such data as nonpersonal since this process requires a common identifier. Therefore, the combination of data sets through a specialized agency may violate the PIPA by providing personal information to a third party without the consent of data subjects. According to a 2017 parliamentary inspection of the administration, 340 million pieces of consumer data were combined from August 2016 to September 2017 under the de-identification guidelines. In November 2017, civic groups filed a complaint against specialized agencies and twenty companies for violating the PIPA.4 Companies have since stopped processing personal data in accordance with the guidelines, though they have not officially been repealed.

Regulatory and Institutional Innovation Hackathon

Both Park and her successor, Moon, touted the Fourth Industrial Revolution and stressed the need to promote the big data industry. As a result, the government’s approach to data protection policies did not change when Moon took office in May 2017, although the words used to describe the policy did. Under Moon, there was more emphasis on consulting with stakeholders rather than merely pushing the government’s policy.

In early 2018, the Presidential Committee on the Fourth Industrial Revolution held a series of events called the Regulatory and Institutional Innovation Hackathon to gather relevant stakeholders from the government, industry, civil society, and academia to discuss and seek consensus on key issues related to the digital revolution. Two stakeholder meetings were held at the second and third installments of the hackathon under the agenda of harmonizing the protection and use of personal information.

At the first meeting, the participants agreed to work toward establishing the legal concepts of personal information, pseudonymized information, and anonymized information to refer to personal data, rather than using the term de-identification. They said the term de-identification was ambiguous because, depending on the level of de-identification, the resulting data sets might still be personal information or might be anonymized information processed to make it impossible to re-identify the data subjects. The participants agreed that anonymized information would not be subject to the PIPA and would be distinguished from personal information.

To clarify the concept of anonymized information, instead of defining this term in law, participants at the meeting discussed supplementing the concept of personal information by referring to recital 26 of the EU’s GDPR, which distinguishes between truly anonymous data and data that has been de-identified but might still be traced to an individual.5 (The GDPR recitals provide additional context to accompany the regulation’s articles.) The group also decided to establish a legal basis for the definition and use of pseudonymized information. Finally, participants agreed to conduct additional discussions on major issues regarding the protection and use of personal information.

Although the participants of the first meeting reached an agreement on basic concepts, their conflicting agendas meant there was no consensus on the details. Thus, the issue of personal information was dealt with again at the second meeting, where participants discussed issues such as the use of pseudonymized information, the combination of data sets, and oversight mechanisms.

Civil society participants argued that the use of pseudonymized personal information for purposes other than those for which it was first collected, and providing it to a third party, would be a restriction of the data subject’s rights. These participants also argued that the use of pseudonymized information for other purposes should be limited to academic research and statistical compilation with public-interest value that benefits society as a whole. Industry representatives, meanwhile, argued that such use of pseudonymized information should be broadly allowed for industrial and market research to develop the big data industry. In the end, the second meeting did not reach a consensus on these issues, and the final report included all of the different positions expressed.6

The So-Called Three Data Laws

After the hackathon, the Moon government proposed three new data protection laws to the National Assembly on November 15, 2018.7 The laws consisted of amendments to the PIPA, to the Network Act, and to the Credit Information Act. One goal of the proposed amendments was to foster the growth of Korea’s data industry. Various civil society groups criticized the government, calling the laws the “personal information theft acts.”8 The government promoted the amendments as having been based on the social consensus achieved in the hackathons, but civil society groups argued that the proposed legislation reflected corporate positions on issues that the hackathon participants could not agree on. Despite the opposition from civil society, the National Assembly passed the laws on January 9, 2020.

Although the laws were packaged as the three data acts, the provisions of the Network Act were ultimately incorporated into the corresponding provisions of the PIPA. Therefore, the main changes to Korean data protection law were the amendments to the PIPA and to the Credit Information Act. Two main aspects of the legislation stand out.

First, the laws introduced pseudonymized information as a legal concept. The amended PIPA defines “pseudonymization” as “a procedure to process personal information so that the information cannot uniquely identify an individual without additional information”—and therefore defines information that has been through this process as pseudonymized information.9 A personal information controller may process pseudonymized information without the consent of data subjects for purposes such as statistical analysis, scientific research, and archiving that is in the public interest.
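One common way to realize this definition in practice is a keyed one-way function: the stored pseudonym cannot identify anyone on its own, while the secret key acts as the “additional information” that must be held separately. The sketch below is a minimal illustration under that assumption; the PIPA does not prescribe any particular technique, and the key and field names are hypothetical.

```python
import hashlib
import hmac

# Minimal pseudonymization sketch: an HMAC of the identifier is stable enough
# for analysis but cannot be reversed without the key, which stands in for the
# "additional information" that must be kept separately.
SECRET_KEY = b"hypothetical-key-held-separately-by-the-controller"

def pseudonymize(identifier: str) -> str:
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"user": "hong.gildong@example.com", "visits": 42}
record["user"] = pseudonymize(record["user"])
print(record)  # the same input always maps to the same pseudonym
```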

The laws allowed a specialized institution designated by the PIPC or the head of a related administrative agency to combine pseudonymized information from different personal information controllers. When processing such information, the controller must ensure that technical, organizational, and physical safety measures are followed. No one may process pseudonymized information for the purpose of identifying an individual, and violations of this rule are punishable with a fine.
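As an illustration of that combination step, the sketch below has a specialized institution derive a shared join key from each controller’s records, so the two data sets can be linked without either controller handling the other’s raw identifiers. The institution’s key, the controllers, and the fields are all assumptions made for the example, not a mechanism the law prescribes.

```python
import hashlib
import hmac

# Hypothetical sketch of combining pseudonymized data sets through a
# specialized institution: only the institution holds the key that derives
# the join pseudonym, so neither controller can link records on its own.
INSTITUTION_KEY = b"hypothetical-key-held-only-by-the-institution"

def join_key(rrn: str) -> str:
    return hmac.new(INSTITUTION_KEY, rrn.encode(), hashlib.sha256).hexdigest()

# Each controller's records, keyed by the institution-derived pseudonym.
telecom = {join_key("800101-1234567"): {"data_usage_gb": 31}}
insurer = {join_key("800101-1234567"): {"claims": 2}}

combined = {k: {**telecom[k], **insurer[k]}
            for k in telecom.keys() & insurer.keys()}
print(combined)  # one linked record, keyed by a pseudonym rather than an RRN
```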

Second, the laws integrated the authority to supervise personal information—a power previously held by the KCC and the Ministry of the Interior and Safety—into the PIPC, which became a central administrative agency. However, the provisions of the Credit Information Act relating to personal information were not incorporated into the PIPA, and so the supervision of such information in the financial sector—that is, personal credit information—remained under the purview of the Financial Services Commission. Provisions on the processing of pseudonymized information for scientific research purposes were also included in the Credit Information Act, but the specific wording used was slightly different from that in the PIPA, which may cause confusion.

The Scope of Scientific Research

At the hackathon, civil society and industry representatives expressed different opinions about the purpose and scope of the use of pseudonymized information. The government’s use of the phrase “scientific research” in the PIPA amendment reflected the desires of industry advocates. The revised act defines scientific research as that which “applies scientific methods, such as technological development and demonstration, fundamental research, applied research, and privately funded research.”10 Research conducted for commercial purposes within a company also fits this definition, as long as it uses the scientific method. A document that accompanied the PIPA amendment states that pseudonymized information may be used for scientific research, “including industrial purposes such as the development of new technologies, products, and services based on data, statistics for commercial purposes such as market research, and archiving purposes in the public interest.”11

Civil society groups argued that the new definition of scientific research would allow for the processing of pseudonymized information for purposes other than those permitted as long as the controllers claimed to be conducting research.12 That is because research does not usually involve unscientific methods, so any research could be said to be scientific. Civil society representatives maintained that the use of personal information without the consent of data subjects should be limited to academic, rather than scientific, research. That is because processing personal information for any purpose other than the original intention limits the rights of data subjects, and to justify such a restriction, there must be a corresponding social value and public interest.13 The previous wording of the PIPA had used the term “academic research” instead of “scientific research”—but did not define it.14

The definition of scientific research in the PIPA amendment was borrowed from the EU’s GDPR. Although the GDPR itself does not define scientific research, recital 159 explains it by stating that “the processing of personal data for scientific research purposes should be interpreted in a broad manner including for example technological development and demonstration, fundamental research, applied research and privately funded research.”15 The recital further states that such processing should take into account the EU’s objective of strengthening its scientific and technological bases by creating a European Research Area in which researchers, scientific knowledge, and technology circulate freely.

The European Data Protection Board, which is responsible for ensuring consistent application of the GDPR, has not yet issued a specific opinion or guideline on scientific research. However, recital 159 makes it seem that scientific research under the GDPR is not limited to research conducted in a particular field or by a particular institution but comprises research that can contribute to a common intellectual community called the European Research Area. In reviewing the concept of scientific research under the GDPR in 2020, the European Data Protection Supervisor (EDPS) stated that “for a controller to simply claim to process data for the purposes of scientific research is not sufficient” and that “it is a common assumption that scientific research is beneficial to the whole of society and that scientific knowledge is a public good to be encouraged and supported.”16

Some Korean civil society members are concerned that by pseudonymizing consumers’ personal information, companies will be able to further process it for purposes unrelated to that for which it was originally collected, combine it with the personal information of other companies, and share it or sell it to other firms in the name of scientific research—without the consent of data subjects.

For example, personal information held by telecommunications companies is very valuable. These companies can pseudonymize consumers’ personal information and use it for their own research purposes or provide it to other firms, such as insurance companies, for research purposes—normally for a fee. Having been pseudonymized, personal information can be shared or sold to numerous companies. It cannot be ruled out that users may be identified and impacted in unanticipated ways when their personal information is combined with other such information.

Strengthening the Powers of the PIPC

The PIPA was first enacted in 2011 to establish a general law applicable to all personal information controllers in the public and private sectors. Therefore, when the PIPA was passed, the relevant provisions of other laws that had previously governed personal information in specific areas, such as the Network Act and the Credit Information Act, should have been repealed. However, these existing laws were maintained because the ministries and bodies in question refused to relinquish their supervisory authority. This led to criticism, especially from stakeholders in academia and civil society who argued that overlapping or similar regulations existed in different personal information laws, causing confusion and increasing the burden of compliance for controllers.

In addition, while the PIPA established the PIPC as a presidential agency, the act only granted the commission certain powers, such as the ability to deliberate and resolve matters concerning the interpretation and operation of personal information protection law. Meanwhile, the Ministry of the Interior and Safety remained the competent ministry and supervisory authority for the PIPA. Therefore, the provisions of the 2020 amendments to unify Korea’s personal information protection laws and integrate the supervisory bodies into the PIPC were desirable steps that had been demanded by civil society and academia.

However, these amendments still have shortcomings. Most notably, because the provisions of the Credit Information Act relating to personal information were not integrated into the PIPA, the Financial Services Commission retains its supervisory authority over personal credit information. As a result, the goal of unifying the similar and overlapping provisions of related laws into the PIPA was only half realized. Overlapping regulations therefore still exist between the PIPA and the Credit Information Act, causing confusion. For example, the two acts refer to scientific research in different ways.

The nearly decade-long delay in integrating Korea’s personal information protection laws and oversight bodies, which ended only when the PIPA was revised in 2020, was due to the selfishness of government departments that did not want to give up their authority. At the time of the hackathon in early 2018, participants from government ministries opposed even putting the supervisory system on the agenda for discussion. So why did the 2020 data laws partly succeed in integrating the existing legislation and the supervisory bodies?

The ostensible purpose of the 2020 amendments was to strengthen supervision of the use of personal information to pave the way for the introduction of pseudonymized information. Yet, the belated reconciliation of interdepartmental interests was probably also due to the fact that an independent supervisory authority was necessary to obtain an adequacy decision under the GDPR—essentially, a ruling from the European Commission that Korea provides an adequate level of protection for personal data transferred from the EU. The Korean government had formed an EU Adequacy Assessment Task Force in August 2015 and conducted adequacy negotiations with representatives of the EU. But in October 2016, the European Commission ruled that Korea’s personal information supervisory body lacked independence and authority. Eventually, after the independent and empowered PIPC was established in August 2020, the European Commission adopted an adequacy decision for Korea in December 2021.17

THE RIGHTS OF DATA SUBJECTS

On September 28, 2021, the government proposed another PIPA amendment, known as the Second Amendment, to the National Assembly. The assembly approved the revised act on February 27, 2023.

Highlights of the Second Amendment

The Second Amendment included extensive provisions. Three major revisions are noteworthy.

First, the special PIPA provisions on the processing of personal information by providers of information and communications services, which had been borrowed from the Network Act, were repealed. Other relevant provisions of the act were revised so that they would apply equally to all personal information controllers, whether or not they are information and communications service providers.

Second, the legislation introduced rights for data subjects that need to be protected as new technologies such as artificial intelligence (AI) evolve. These are the right to request the transfer of personal information and the right to control over automated decisions. The former refers to a data subject’s right to ask a personal information controller to hand over personal information about that data subject to the data subject themselves, to an institution that specializes in personal information management, or to a person who can take appropriate security measures and meet relevant technical standards. The right to control over automated decisions guarantees that a data subject can reject a decision, or request an explanation of it, if the decision was made using a completely automated system, including AI, and has a significant impact on their rights or obligations.

Third, the amendment addressed previous shortcomings in the PIPA. The legislation established new provisions related to the installation and operation of mobile video information-processing devices, and it supplemented the rules for the overseas transfer of personal information. In addition, the Second Amendment changed the sanction method for PIPA violations from a criminal punishment to a fine—effectively, an economic sanction.

Shortcomings of the Second Amendment

It is true that the Second Amendment improved the PIPA in general, first, by resolving the problems of having special provisions that applied only to information and communication service providers and, second, by introducing new rights for data subjects. However, from the point of view of civil society, many areas are still lacking.

First, the PIPA’s level of protection is generally lower than that of the EU’s GDPR. Under the GDPR, an organization can receive a fine of up to 4 percent of its global annual turnover for violations. But under the Korean act, the upper limit of any fine is 3 percent of total turnover, and the basis on which the fine is calculated excludes turnover that is not related to the violations. Companies were the stakeholders most opposed to this provision, and their opinions were partly accepted by the National Assembly during the deliberation of the bill.
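With hypothetical figures, the gap between the two ceilings is easy to see. The sketch below assumes, purely for illustration, a company whose turnover related to the violation is a quarter of its total turnover.

```python
# Illustrative comparison of maximum fines; all figures are hypothetical.
total_turnover = 1_000_000_000      # company-wide annual turnover
related_turnover = 250_000_000      # portion related to the violation (assumed)

gdpr_cap = 0.04 * total_turnover    # GDPR: up to 4% of global annual turnover
pipa_cap = 0.03 * related_turnover  # PIPA: up to 3%, excluding unrelated turnover

print(f"GDPR ceiling: {gdpr_cap:,.0f}")  # GDPR ceiling: 40,000,000
print(f"PIPA ceiling: {pipa_cap:,.0f}")  # PIPA ceiling: 7,500,000
```

Under these assumptions, the Korean ceiling is less than a fifth of the GDPR ceiling, and the gap widens as the violation-related share of turnover shrinks.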

While the Second Amendment established new data subject rights about automated decisions, the amendment did not include the term “profiling,”18 unlike the GDPR. In addition, while the GDPR restricts decisions based solely on automated processing that has a significant impact on individuals, except in certain cases, the PIPA permits such decisions and grants data subjects the right to reject or opt out of them, with the same exceptions. It is questionable whether such a right can be properly guaranteed.

In the case of the GDPR, data subjects have the right to be notified about the processing of their personal information regardless of the legal basis for that processing, whereas under the PIPA, a data subject is notified of the relevant facts only when the personal information has been given with their consent. Therefore, if an automated decision is made about a data subject based on the legitimate interests of the controller, the data subject may be unaware of that decision and therefore unable to exercise their right to reject it.

Second, the right to request the transfer of personal information under the PIPA is similar to the right to data portability under the GDPR. However, while the GDPR provides for the right to request the transfer of personal information to a data subject or another data controller, the Korean law also specifically mentions transfers to third-party institutions that specialize in personal information management—referred to as My Data providers.

Whereas the policy of pseudonymized information was intended for the use of personal information without the consent of data subjects, the My Data policy sought to promote the use of such information with their consent. The Moon administration promoted the My Data policy, which civil society groups criticized as accelerating the commercialization of personal information. Although the policy is based on consent, it is possible for data subjects to consent to the provision of their personal information without being sufficiently aware of the negative impacts that the My Data businesses might have on them. In addition, when My Data providers integrate personal information in fields such as telecommunications, healthcare, and finance, the negative consequences for data subjects’ rights can be even greater.

Third, the Second Amendment did not reflect the improvements requested by civil society groups in written opinions submitted to the PIPC before the amendment was announced. These demands included provisions to strengthen the accountability of personal information controllers, such as impact assessments for private-sector controllers; strengthen data subjects’ rights, such as the right to be notified of the processing of their data; unify the Credit Information Act, the Location Information Act, and the PIPA; strengthen the requirements for investigative agencies wishing to access personal information, such as a warrant to access information held by public institutions; and enhance the remedies for PIPA violations, including class action lawsuits. Civil society groups proposed amendments to the PIPA, but these were not considered by the National Assembly. For this reason, civil society does not view the Second Amendment favorably.

SUMMARY

The 2020 and 2023 revisions of the PIPA were driven by the need to foster new industries, such as big data and AI. Although the main purpose was to promote the use of personal information, provisions to protect such information were also included as a counterweight. Many of the new provisions refer to the GDPR, although the EU regulation was not copied verbatim.

Overall, Korea’s legislation on the protection of personal information is similar to the European system, and although the Korean level of protection is not low, it was intentionally set at a lower level than that of the GDPR. Some experts in Korea believe that one motivation for the EU’s strong personal information protection is to keep U.S. technology companies in check, because the European technology industry is not highly competitive. These experts use this reasoning to support the argument that Korea’s level of personal information protection should not be raised to a similar level to the EU’s.

While this interpretation may not be entirely erroneous, it distorts the point of data protection policy. Does the goal of fostering Korea’s domestic industry mean that the personal information of Korean citizens should be less protected than that of European citizens? It is not necessary to replicate EU policies in Korean law, but it is problematic to prioritize the need to foster domestic industries without discussing the pros and cons of the policies themselves.

CYBERSECURITY OF GOVERNMENT SYSTEMS: GRADE F

Because Korea has an advanced internet infrastructure and e-government system, cybersecurity is very important. But the country’s cybersecurity governance is lagging behind other countries’. It was only in 2019 that the Moon administration first established a national cybersecurity strategy. Prior to that, the government had created what it called comprehensive countermeasures, rather than an overall strategy, in the wake of major cybersecurity incidents, such as when Korea’s nuclear operator was hacked in 2014.19

The 2019 National Cybersecurity Strategy is essentially only an outline: it does not include specific implementation plans and was created without in-depth discussions with stakeholders, including civil society. Perhaps because of the tense relationship with North Korea, South Korea’s cybersecurity policy overemphasizes the aspect of national security.20 In particular, the National Intelligence Service (NIS) is responsible for cybersecurity in the public sector, a setup that hinders the development of Korea’s cybersecurity governance.

The NIS controls the National Cybersecurity Center and carries out tasks such as establishing national cybersecurity policies, detecting and responding to cyber attacks on public sector networks, and verifying security suitability and cryptographic modules for IT products used by public institutions. However, civil society members have criticized the NIS’s cybersecurity work for having a weak legal basis. Although the service’s work on digital public sector infrastructure and on security and cryptographic verification is based on relevant government acts, there was no similar legal basis for establishing national cybersecurity policies or regulating the cybersecurity of public sector networks. These moves were based on the 2005 National Cybersecurity Management Regulations, which resulted from a presidential order with no higher legal basis.

For this reason, the NIS has been trying to establish a legal basis for its cybersecurity work and, to this end, has proposed legislation: the National Cyber Terror Prevention Act and the National Cybersecurity Framework Act. However, these efforts failed in part because of societal opposition. There has been a great deal of concern that the NIS—a secretive intelligence agency notorious for its surveillance of civilians and politicians, its fabrication of espionage cases, and its interference in politics21—could expand its surveillance power throughout cyberspace. Ironically, however, when the act that created the NIS was amended on October 19, 2021, for the purpose of reforming the service, the revised act specified the NIS’s authority in the field of cybersecurity.

One of the key responsibilities that allowed the NIS to abuse its power in the past was the service’s investigatory authority. Therefore, the Moon administration, which had pushed for NIS reform, revised the service’s duties to abolish its investigatory power while adding further cybersecurity-related tasks. Despite criticism from civil society about the NIS’s cybersecurity authority, the government and the ruling party at the time did not consider these concerns to be important.

The reason why Korean civil society opposes the NIS’s cybersecurity mandate is not just because the service has a history of human rights violations and political interference. Civil society’s distrust of the NIS remains high, as the service’s illegal activities continued until recently. In 2015, leaked data revealed that the NIS had been using a hacking program called RCS, developed by the Italian company Hacking Team, for online surveillance.22 There are still major concerns that the NIS’s authority over cybersecurity will allow the service to strengthen its online surveillance and monitoring.

In addition, although cybersecurity intelligence collection might be a valid role of the NIS, it does not have to be the intelligence agency that establishes cybersecurity policies or prevents and responds to cyber attacks. Indeed, if the NIS is responsible for these tasks, cooperation with other stakeholders may become more difficult because of Koreans’ distrust of the service. The participation of civil society is essential to establish cybersecurity policies based on openness and human rights, but the government has not consulted with civil society to this end. It has also become difficult for the National Assembly, the media, and civil society to maintain transparency and oversight of the NIS’s cybersecurity work, because the service is subject to fewer transparency obligations and less parliamentary oversight than other government departments.

Although the 2021 revision of the NIS Act stipulated the service’s cybersecurity authority, Korea still lacks consistent and systematic cybersecurity laws. Different terms and concepts are used in cybersecurity-related laws, such as the Network Act, the Information and Communications Infrastructure Protection Act, and the NIS Act and its enforcement ordinances. There is also no national cybersecurity governance mechanism stipulated by law. While there is a need to improve the consistency of these laws, it is unlikely that a societal consensus will be reached in the foreseeable future, as long as the NIS retains its cybersecurity powers.

DIGITAL IDENTITY: GRADE C

Many countries use national identification numbers to give citizens access to public services. Korea’s system of resident registration numbers (RRNs) has been criticized as a major privacy violation. All Korean citizens are given an RRN at birth, which does not change throughout their lives. The numbering system includes personal information such as date of birth, gender, and place of birth. In the past, RRNs have been collected for personal identification in various fields, both public and private. Korea also uses other numbers, such as a person’s driver’s license number, passport number, and national health insurance number, but all of them are linked to the RRN. RRNs are still collected in major sectors such as finance, telecommunications, and healthcare.
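The privacy concern follows directly from the number’s structure. The sketch below decodes the legacy RRN layout, in which the first six digits are the birth date, the seventh digit encodes century and gender, and the following digits historically formed a regional code; it is a simplified illustration with a made-up number, not a complete or authoritative parser.

```python
# Simplified decoder for the legacy RRN layout (YYMMDD-SRRRRCC), illustrating
# why the number itself leaks personal data. The sample RRN is made up.
def decode_rrn(rrn: str) -> dict:
    birth, rest = rrn.split("-")
    s = int(rest[0])                         # century/gender digit
    century = 1900 if s in (1, 2) else 2000  # 1, 2 -> 1900s; 3, 4 -> 2000s
    return {
        "birth_date": f"{century + int(birth[:2])}-{birth[2:4]}-{birth[4:6]}",
        "gender": "male" if s % 2 == 1 else "female",
        "region_code": rest[1:5],  # historically encoded the place of registration
    }

print(decode_rrn("800101-1234567"))
# {'birth_date': '1980-01-01', 'gender': 'male', 'region_code': '2345'}
```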

In the early days of informatization, large-scale leaks of personal information occurred frequently. For instance, in 2008, a breach of the internet auction site Auction affected 18 million personal information records. In 2012, 35 million records were leaked from Cyworld, a social networking service. The leaked personal data included RRNs, which acted as a key to link up different types of personal information, increasing the damage caused by the breaches.23

As a result, Korean society has demanded improvements to the RRN system. On August 8, 2014, the National Human Rights Commission of Korea recommended three major changes. The first was to limit the purpose of processing RRNs, so that the numbers would be used only for administrative work related to resident registration and judicial administration, with a separate identification system for other public areas. The second recommendation was to change the numbering system of RRNs to a random number format that does not contain personal information. Third, the commission recommended the creation of a process for data subjects to change their RRN if they wish to do so.

These recommendations have so far been implemented in only a very limited way. On February 17, 2013, the collection of RRNs through information and communication networks was prohibited; and on August 7, 2014, the collection of RRNs was banned in all areas of society without a legal basis. However, since the numbers can be collected under the provisions of laws or enforcement decrees, most public institutions continue to do so. On May 19, 2016, the Resident Registration Act was amended to allow data subjects to change their RRN. However, such changes are possible only in limited cases where there is a risk of damage to life, body, or property due to the number being leaked.

In the private sector, where the collection of RRNs was already restricted, another identification number similar to the RRN was introduced. Before the advent of Korea’s internet real-name system for identifying users, many internet companies initially used RRNs to voluntarily verify the identities of their users. Later, when RRN leaks became a societal problem, companies were required to adopt an alternative identification method that did not collect RRNs, and so the i-PIN service was launched.

Originally, i-PIN generated a unique user identifier for each digital interaction, which meant that the same user would have a different identifier for each website they signed up to. The government then introduced so-called connecting information (CI) to identify the same user across different operators. CI consists of 88 bytes of information created by encrypting an RRN and is a one-to-one match with the RRN from which it is derived. CI is therefore effectively a second RRN used by the private sector, with the difference that it does not contain personal information. While it should be up to companies to decide how to partner with each other, the Korean government introduced CI without any legal basis to facilitate identity verification in the private sector. In September 2021, civil society groups filed a constitutional complaint, arguing that CI had no legal basis and excessively violated citizens’ basic rights, including the right to privacy.
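The stated length is suggestive: 88 bytes is exactly the Base64 encoding of a 64-byte value, such as an HMAC-SHA512 digest. The sketch below shows one way a deterministic, one-to-one, 88-character token could be derived from an RRN. The actual CI algorithm and key arrangements are not described in this report, so the construction and the shared key here are assumptions.

```python
import base64
import hashlib
import hmac

# Illustrative only: the real CI derivation is not specified here. This shows
# how a deterministic 88-character token, one-to-one with an RRN, could be
# produced: HMAC-SHA512 yields 64 bytes, and Base64-encoding 64 bytes yields
# exactly 88 characters.
SHARED_KEY = b"hypothetical-key-shared-by-identity-verification-agencies"

def derive_ci(rrn: str) -> str:
    digest = hmac.new(SHARED_KEY, rrn.encode(), hashlib.sha512).digest()
    return base64.b64encode(digest).decode()

ci = derive_ci("800101-1234567")  # made-up RRN
print(len(ci))  # 88
```

Because the derivation is deterministic, every operator that receives a user’s CI sees the same value, which is exactly what makes it possible to recognize the same person across different services.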

When a customer signs up for a cell phone service in Korea, it is mandatory to verify their identity through SIM card registration. Because of the combination of RRNs, CI, and SIM card registration, Korean users have no choice but to use the internet based on their real identities, which means that Koreans can be easily tracked anytime, anywhere. The Personal Information Portal,24 operated by the PIPC, provides a service that allows citizens to check their identification details and, if they wish, withdraw their information from a website. A user can see when and where they verified their identity, see what services they signed up for, and request to be removed from sites they no longer want to use. It is ironic that a service that allows the government to know citizens’ internet service subscriptions is offered in the name of privacy.

CONTENT MODERATION: GRADE C

Since the early days of the internet, Korea’s administrative agencies have reviewed online content and demanded that illegal or harmful content be deleted or blocked. Since 2008, the Korea Communications Standards Commission (KOCSC)25 has performed this role under the Network Act, which prohibits the distribution of illegal material, such as pornography, defamatory content, and material used for online stalking. The content examined by the KOCSC includes material that could be harmful to children as well as content types stipulated in the Network Act and other legislation.

The Deliberation Rules on Information and Communications, which set out in detail the types of content that require deliberation, also list a wide range of content that the KOCSC considers harmful, rather than illegal. As a result of its deliberation, the commission may request corrective measures, such as the deletion or blocking of the material or the suspension or termination of a user’s account. Although such measures are technically only a recommendation, they are in effect mandatory. This is because the KCC can issue a correction order, which is mandatory, if the KOCSC’s recommendation is not accepted. In the case of material that leaks state secrets, violates the National Security Law, is intended for criminal purposes, or aids or abets a crime, if the KOCSC’s request for correction is not followed, the KCC must issue a correction order.

In 2022, the KOCSC reviewed 248,130 cases,26 of which 234,263 were determined to require corrective action. The most common type of corrective request was to block access to illegal or harmful information from overseas, which occurred in 192,621 cases. There were 21,867 decisions to disable or suspend users’ accounts and 19,378 decisions to delete the content. By type of violation, material that constituted an online sexual offense was the most common, with 54,994 cases (23.5 percent of the total), followed by gambling content, with 53,177 cases (22.7 percent), and obscene content, with 46,195 cases (19.7 percent). Material that violated the National Security Law was found in 2,071 cases (0.9 percent).27

Korean civil society groups have criticized the KOCSC’s deliberations as state censorship. Indeed, in a 2018 report, the UN special rapporteur on freedom of opinion and expression noted that some countries require the blocking of foreign websites and content that are deemed illegal under domestic law, which can lead to serious violations of the freedom of expression. The report recommended that “states should refrain from adopting models of regulation where government agencies, rather than judicial authorities, become the arbiters of lawful expression.”28 In the case of content that causes serious damage to a specific individual, such as digital sexual violence, there may be an urgent need to delete or block the material. However, targeting content that is deemed to undermine societal interests can stifle criticism of power and violate the freedom of expression.

In particular, content related to North Korea has been easily subjected to regulation. For example, on March 24, 2016, the KOCSC blocked access to North Korea Tech,29 a website that specializes in ICT issues in North Korea, citing violations of the National Security Act. In response, Martyn Williams, the operator of the site, with the support of OpenNet Korea, filed an administrative lawsuit, and a court of first instance ruled that blocking the entire website was illegal as it violated the principle of minimum regulation. In an appeal decision on October 18, 2017, the Seoul High Court upheld the decision of the court of first instance and dismissed the appeal.30

ENCRYPTION: GRADE B

Debates about encryption policy usually revolve around whether government agencies should be allowed to break encryption and whether they should be able to force private companies to break it on their behalf. Korea has no such policy, nor does it regulate the development and adoption of cryptographic technologies in the private sector. This is quite puzzling given the country’s tense relationship with North Korea and the fact that the NIS has considerable surveillance power, which it has abused in the past.

The NIS Act authorizes the service to carry out security work on documents, materials, facilities, areas, and personnel pertaining to state secrets, while a related enforcement decree provides specific regulations on the management of cryptographic materials. In addition, the NIS verifies the safety and implementation suitability of cryptographic modules used to protect important information in the materials communicated by public institutions such as administrative agencies. While this system has some impact on private cryptography, it does not regulate the free development and adoption of cryptography in the private sector.

The Framework Act on Intelligent Informatization requires the government to “prepare measures to facilitate the development and use of cryptography technology and to ensure the safety of intelligent information services using cryptography technology.”31 The Korea Internet and Security Agency operates a website for the “vitalization of cryptography technology,”32 but this effort promotes rather than regulates the use of cryptography. The Framework Act on Electronic Documents and Electronic Transactions, meanwhile, stipulates that “the government may restrict the use of encryption products and take necessary measures to access the original encrypted information or encryption technology if it deems it necessary for national security,”33 but this provision has not been controversial.

In November 2020, under the Moon administration, then justice minister Choo Mi-ae sparked controversy when she instructed her office to consider a bill that would allow the government to forcibly unlock a suspect’s smartphone for investigative purposes. Because her instruction came in the context of a conflict with Yoon, who was the prosecutor general at the time, conservatives pushed back against it. Progressives and liberals, who formed the Moon administration’s support base, were also critical of the move. Civil society objected as well, saying that such a bill might violate people’s fundamental rights. In the end, the bill did not move forward.

AI REGULATION: GRADE C

In 2016, a match of the board game Go between AlphaGo, an AI program developed by DeepMind, a subsidiary of Google, and Lee Se-dol, a professional player, shocked Korean society because Go is considered by many to be the most difficult board game in the world—and Lee lost. The event increased the interest in and adoption of various AI tools across Korean society, including chatbots, translation aids, recruitment tools, and social media algorithms. At the same time, there have been controversies related to the development and use of AI. The chatbot Lee Luda, launched in December 2020, was the first in Korea to raise the issue of discrimination and hate speech by AI; Lee Luda was shut down after three weeks. The PIPC conducted an investigation into the chatbot’s misuse of personal information and imposed a total of 103.3 million won (around $78,000) in fines and penalties on Scatter Lab, the developer.34

Controversies have also arisen over AI tools for public-sector recruitment. In 2020, civil society groups requested the disclosure of information on AI recruitment tools used by public institutions. These groups criticized the fact that the institutions had adopted AI tools from private companies without reviewing their problems and performance and that they did not have adequate data to answer complaints from parties affected by AI. Several advocacy groups called for the establishment of a system to ensure public institutions’ accountability. In addition, in 2021, it became known that the Ministry of Justice had about 170 million records containing facial recognition data and other information on Korean and foreign citizens that had been collected during immigration inspection processes.35 All of this data was shared with private companies for AI learning and algorithm verification—without the consent of the data subjects—for the purpose of upgrading the country’s immigration system.

Despite these controversies, discussions of how to regulate the risks of AI in Korean society are still at a rudimentary level. So far, government policies have focused on fostering the AI industry and emphasized AI ethics and self-regulation rather than active regulation. On December 17, 2019, the Moon administration released the National Strategy for Artificial Intelligence,36 which understood AI as a civilizational change and expressed the government’s intention to use the technology as an opportunity to develop Korea’s economy and solve social problems. The strategy presented future visions for the AI era in three areas, to be realized through nine substrategies and one hundred action tasks. One of the nine substrategies, bold regulatory innovation, has as its guiding principle “allow first, regulate later.”

On December 23, 2020, the Ministry of Science and ICT and the Korea Information Society Development Institute released the AI Ethical Standards,37 which aim to achieve “AI for humanity” through three basic principles to be observed in the development and use of AI: human dignity, the public good of society, and technology that is fit for purpose.

To practice and implement the three basic principles, the standards set out ten core requirements that must be met in the entire process of AI development and use: a guarantee of human rights, protection of privacy, respect for diversity, a prohibition on infringement, public good, solidarity, data management, accountability, safety, and transparency. In an accompanying press release, the government stated that the standards were “not binding ‘laws’ or ‘guidelines’ but rather moral norms and voluntary codes.”38

Then, on May 13, 2021, the Ministry of Science and ICT released the Strategy for Realizing Trusted AI, which, unlike previous national strategies, focused on concerns about AI. The strategy recognized that it is difficult for AI to be accepted socially and industrially without societal trust in the technology. The strategy proposed three substrategies and ten action tasks for trusted AI while focusing on building a support system to secure reliability in a voluntary way in the private sector.39

A year later, the National Human Rights Commission released the Human Rights Guidelines for the Development and Use of AI to prevent human rights violations and discrimination that may occur in the process of developing and using AI. The commission is currently developing tools to support AI developers and deployers as they conduct human rights impact assessments.

In early 2023, the National Assembly and the Ministry of Science and ICT pushed for a basic law on AI. The bill presented was a consolidation of bills previously proposed by various members of the assembly. When it became known in February 2023 that the bill had passed the committee review stage, civil society groups strongly objected. These groups argued that the Ministry of Science and ICT, which was the lead ministry for the bill, was not an appropriate supervisory body for AI as it prioritizes industrial development over risk management.

In addition, the bill’s approach of “allow first, regulate later” has raised concerns that it could permit high-risk AI applications to enter the market and undermine human rights and safety regulations. The bill defines high-risk AI, but according to many critics, the definition does not include enough areas; and unlike the EU’s AI Act, the Korean bill does not define which forms of AI should be banned. The bill does provide some compliance requirements for AI in high-risk areas, but these are not sufficient to mitigate risks and are ineffective because there are no penalties for noncompliance.

Civil society is not opposed in principle to a bill to regulate AI but believes that the current proposal does not include sufficient safeguards to mitigate the risks of the technology. More discussion is needed on this issue as well as the questions of how high-risk AI should be defined, which usages should be prohibited, and which ministries should be responsible for overseeing AI.

NOTES

1 International Telecommunication Union, “Measuring Digital Development: ICT Development Index 2023,” 2023, https://www.itu.int/itu-d/reports/statistics/IDI2023.

2 International Telecommunication Union, “Measuring the Information Society Report 2017: Executive Summary,” 2017, https://www.itu.int/dms_pub/itu-d/opb/ind/D-IND-ICTOI-2017-SUM-PDF-E.pdf.

3 “Guidelines for De-identification of Personal Data,” South Korean Ministry of the Interior and Safety, June 30, 2016, https://www.privacy.go.kr/cmm/fms/FileDown.do?atchFileId=FILE_000000000827480&fileSn=1.

4 “시민단체, 고객정보 3억4천여만 건 무단결합한 비식별화 전문기관 및 20개 기업 고발” [Civil society groups file a complaint against de-identification specialized agencies and twenty companies for combining more than 340 million pieces of customer data without authorization], Korean Progressive Network Jinbonet, press release, November 9, 2017, https://act.jinbo.net/wp/33555.

5 “Recital 26 of the General Data Protection Regulation,” Intersoft Consulting, European Union, accessed February 14, 2024, https://www.privacy-regulation.eu/en/recital-26-GDPR.htm.

6 “Results of the 3rd Regulatory and Institutional Innovation Hackathon,” Presidential Committee on the Fourth Industrial Revolution, April 6, 2018, http://webarchives.pa.go.kr/19th/www.4th-ir.go.kr/pressRelease/detail/57?category=report.

7 “Data regulation innovation, a blueprint came out. – On 11.15, amendments to three laws related to personal information protection were proposed in the National Assembly,” Ministry of the Interior and Safety, November 22, 2018, https://www.mois.go.kr/frt/bbs/type010/commonSelectBoardArticle.do?bbsId=BBSMSTR_000000000008&nttId=67218.

8 Several user rights advocates were outspoken in their opposition. For example, see “개인정보 도둑법 강행하는 정부 규탄한다” [We condemn the government for pushing ahead with the personal information theft laws], Jinbonet, December 9, 2019, https://act.jinbo.net/wp/41952.

9 “Personal Information Protection Act,” Korea Law Translation Center, March 14, 2023, https://elaw.klri.re.kr/eng_mobile/viewer.do?hseq=62389&type=part&key=4.

10 “Personal Information Protection Act,” Korea Law Translation Center.

11 “Personal Information Protection Act – All Reasons for Enactment/Revision,” Korean Law Information Center, March 14, 2023, https://law.go.kr/LSW/lsRvsRsnListP.do?lsId=011357&chrClsCd=010202&lsRvsGubun=all/.

12 “개인정보 판매와 공유를 허용하는 개인정보보호법 반대한다!” [We oppose a Personal Information Protection Act that allows the sale and sharing of personal information!], Jinbonet, November 21, 2018, https://act.jinbo.net/wp/40024.

13 “개인정보 판매와 공유를 허용하는 개인정보보호법 반대한다!” [We oppose a Personal Information Protection Act that allows the sale and sharing of personal information!], Jinbonet.

14 “Scientific research” in English can be translated as “학술 연구” or “과학적 연구” in Korean. “Academic research” in this chapter, or “학술 연구,” is research that takes place primarily within the academic community, often requires peer review, and whose results are shared with society and contribute to the expansion of society’s knowledge base. “Scientific research” in this chapter, or “과학적 연구,” may be used in a similar sense to “academic research,” but it has strong nuances as research using scientific methods.

15 “Recital 159, Processing for Scientific Research Purposes,” Intersoft Consulting, European Union, accessed February 14, 2024, https://gdpr-info.eu/recitals/no-159.

16 “A Preliminary Opinion on Data Protection and Scientific Research,” European Data Protection Supervisor, January 6, 2020, https://edps.europa.eu/sites/edp/files/publication/20-01-06_opinion_research_en.pdf.

17 “Study on How to Improve the Personal Information Protection Legal System in Accordance With International Human Rights Standards Such as the European Union General Data Protection Regulation (GDPR),” Institute for Digital Rights, November 16, 2020, 71–72, https://idr.jinbo.net/673.

18 According to the GDPR, “profiling” refers to any form of automated processing of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyze or predict aspects concerning that person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements. See “Art. 4 GDPR: Definitions,” Intersoft Consulting, European Union, accessed February 20, 2024, https://gdpr-info.eu/art-4-gdpr.

19 “S Korea Nuclear Firm to Hold Cyber-Attack Drills After Hack,” BBC, December 22, 2014, https://www.bbc.com/news/world-asia-30572575.

20 “Security” in English translates to “보안” or “안보” in Korean. If “안보” has a strong meaning of national security, “보안” is used in a neutral and technical context. For example, cybersecurity includes the security of individual computers and network systems, but the state can also be a factor that can threaten individual security. However, cybersecurity in terms of national security mainly emphasizes the threat of external attackers (especially North Korea in the Korean context). The National Cyber Security Strategy announced in 2019 used the term “보안.” In other words, it focuses on cyber strategy in terms of national security. Cybersecurity naturally includes a national security context, but the national security aspect has been particularly emphasized in Korea, and this is one factor that distorts cybersecurity-related policies and social discourse in Korea.

21 Bong-jin Choi, “Can a National Security Agency That Tried to Turn Back the Clock Really Make a Difference?,” OhmyNews, May 30, 2017, https://m.ohmynews.com/NWS_Web/Mobile/at_pg.aspx?CNTN_CD=A0002329933.

22 Kim Oi-hyun et al., “NIS Hacking Targeted South Korean Nationals in China,” Hankyoreh, July 22, 2015, https://english.hani.co.kr/arti/english_edition/e_national/701313.

23 Hee-seok Yoon, “[100대 사건_077] 대규모 개인정보 유출 사고 <2008년 2월>” [(100 Major Incidents, No. 77) Large-scale personal information leaks, February 2008], ETNews, September 17, 2012, https://www.etnews.com/201209110625.

24 Personal Information Portal, South Korean Personal Information Protection Commission, https://www.privacy.go.kr.

25 Korea Communications Standards Commission, https://www.kocsc.or.kr.

26 Figures are from the KOCSC’s 2022 Annual Report (in Korean). See “Annual Report 2022,” Korea Communications Standards Commission, 2023, http://www.kocsc.or.kr/commons/pdfViewer/web/viewer.html?file=/upload/main/bbs/info_Casebook_main/BBS_202306270353117001#page=1&zoom=auto,-16,745.

27 “2022 Yearbook of Broadcasting and Communications Review,” Korea Communications Standards Commission, 2023, http://www.kocsc.or.kr/commons/pdfViewer/web/viewer.html?file=/upload/main/bbs/info_Casebook_main/BBS_202306270353117001#page=1&zoom=auto,-16,745.

28 “Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression,” UN General Assembly, April 6, 2018, Para 68, https://undocs.org/Home/Mobile?FinalSymbol=A%2FHRC%2F38%2F35&Language=E&DeviceType=Desktop&LangRequested=False.

29 North Korea Tech, https://www.northkoreatech.org.

30 KOCSC’s blocking of North Korean ICT information media North Korea Tech was later found unlawful by the High Court. See “Court of Appeals Confirmed That the Blocking of ‘North Korea Tech’ Website Is Unlawful,” Opennet, October 23, 2017, https://www.opennetkorea.org/en/wp/2208.

31 “Framework Act on Intelligent Informatization,” Korea Law Translation Center, June 9, 2020, article 57(2), https://elaw.klri.re.kr/eng_mobile/viewer.do?hseq=54720&type=part&key=43.

32 Vitalization of Cryptography Technology, https://seed.kisa.or.kr.

33 “Framework Act on Electronic Documents and Electronic Transactions,” Korea Law Translation Center, June 1, 2012, article 14(2), https://elaw.klri.re.kr/eng_mobile/viewer.do?hseq=27334&type=part&key=28.

34 “PIPC Imposes Sanctions Such as Fines and Penalties on Scatter Lab, Developer of ‘Lee Luda,’” Personal Information Protection Commission, April 28, 2021, https://www.pipc.go.kr/np/cop/bbs/selectBoardArticle.do?bbsId=BS074&mCode=C020010000&nttId=7298&fbclid=IwAR3SKcMQi6G5pR9k4I7j6GNXtc8aBVDOwcURevvvzQtYI7AS40UKYXoOXo8.

35 Cheon Ho-seong, “Government Hands Over 170 Million Immigration Mugshots to AI Firm,” Hankyoreh, October 21, 2021, https://www.hani.co.kr/arti/economy/it/1016022.html.

36 “Artificial Intelligence (AI) National Strategy Released,” Ministry of Science and ICT, December 17, 2019, https://www.korea.kr/briefing/pressReleaseView.do?newsId=156366736.

37 “MSIT Establishing the AI Ethics Standards,” Ministry of Science and ICT, December 23, 2020, https://www.korea.kr/briefing/pressReleaseView.do?newsId=156428773.

38 “MSIT Establishing the AI Ethics Standards,” Ministry of Science and ICT.

39 “Announcing a Strategy for Implementing Trusted AI,” Ministry of Science and ICT, May 13, 2021, https://www.korea.kr/briefing/pressReleaseView.do?newsId=156451595.