
Seeking full list of ISO ALPHA-2 and ISO ALPHA-3 country codes?

Geographic Information Systems. Asked by Lampard on April 15, 2021

I’m searching for a full list of countries with their country codes.

Like the one on this page (it needs to be full and valid):

http://www.nationsonline.org/oneworld/country_code_list.htm

18 Answers

The official ISO 3166-1 site is probably the most up-to-date source for the two-letter codes. Unfortunately, they don't have the alpha-3 codes online; quoting their site:

Where can I find the ISO 3166-1 alpha-3 country code for free download on the ISO 3166/MA Website?

Nowhere. The alpha-3 code is not made available free of charge. You can buy the International Standard ISO 3166-1 from our ISO Store. It contains the three-letter code.

A bit strange in the internet era, but luckily there is a Wikipedia article with the full list and an official UN document that covers the subject, with country codes.

Update:

There's a list at the CIA site with FIPS 10, ISO 3166 alpha-2, ISO 3166 alpha-3, STANAG and internet TLD codes (e.g., .il or .uk).


Note that these lists contain non-country entities like Antarctica.

Correct answer by Adam Matan on April 15, 2021

You can find all (most?) of the two- and three-letter codes in http://download.geonames.org/export/dump/countryInfo.txt - it also has ISO numeric and FIPS codes and other country info.
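For example, a minimal Python sketch for pulling the code pairs out of that file (assuming, per the file's own header comments, that it is tab-separated, that comment lines start with #, and that the first two columns are the alpha-2 and alpha-3 codes):

import urllib.request

URL = 'https://download.geonames.org/export/dump/countryInfo.txt'

with urllib.request.urlopen(URL) as resp:
    text = resp.read().decode('utf-8')

for line in text.splitlines():
    # skip blank lines and the '#'-prefixed comment/header lines
    if not line or line.startswith('#'):
        continue
    fields = line.split('\t')
    # first two columns: ISO alpha-2 and ISO alpha-3 (verify against the
    # header comments in the file itself)
    print(fields[0], fields[1])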

Answered by Ian Turton on April 15, 2021

If you want to periodically update your list, you could scrape one of the sources and parse its results into a useful format. I've done so here for converting the Wikipedia country code list into a CSV:

# Python 2 code (urllib2, BeautifulSoup 3); updated versions using
# requests/urllib3 and bs4 appear in later answers
import csv
import urllib2
from BeautifulSoup import BeautifulSoup

opener = urllib2.build_opener()
opener.addheaders = [('User-agent', 'Mozilla/5.0')]

url = 'http://en.wikipedia.org/wiki/ISO_3166-1'

page = opener.open(url)
soup = BeautifulSoup(page.read())

# "Current Codes" is second table on the page
t = soup.findAll('table', {'class' : 'wikitable sortable'})[1]

# create a new CSV for the output
iso_csv = csv.writer(open('wikipedia-iso-country-codes.csv', 'w'))

# get the header rows, write to the CSV
iso_csv.writerow([th.findAll(text=True)[0] for th in t.findAll('th')])

# Iterate over the table pulling out the country table results. Skip the first 
# row as it contains the already-parsed header information.
for row in t.findAll("tr")[1:]:
    tds = row.findAll('td')
    raw_cols = [td.findAll(text=True) for td in tds]
    cols = []
    # the country cell contains a varying number of text nodes because of the
    # flag image -- the name is always the last one
    cols.append(raw_cols[0][-1])
    # for all other columns, use the first result text
    cols.extend([col[0] for col in raw_cols[1:]])
    iso_csv.writerow(cols)

Answered by scw on April 15, 2021

You may use this code: https://classic.scraperwiki.com/scrapers/iso_3166-1/edit/ -- lxml is generally faster than BeautifulSoup.

Copied it here:

# Python 2 code, written for the ScraperWiki Classic environment
import scraperwiki
import lxml.html
import datetime
import json

from unidecode import unidecode

def get_html(title):
    raw_json = scraperwiki.scrape("http://en.wikipedia.org/w/api.php?action=parse&format=json&page=" + title)
    html = json.loads(raw_json)['parse']['text']['*']
    return html

page_title = "ISO_3166-1"

html = get_html(page_title)
doc = lxml.html.fromstring(html)

for count, tr in enumerate(doc.cssselect('tr')):
    row = [td.text_content() for td in tr.cssselect('td')]
    if len(row)==5:
        for ahref in tr.cssselect('a'):
            detailink = ahref.attrib['href']
            if detailink.find(':',0,len(detailink)) != -1:
                detailink = detailink[6:]
                print detailink
        now = datetime.datetime.now()
        data ={"tmsp_scraped":str(now), "eng_short_name":row[0], "alpha_2_code":row[1], "alpha_3_code":row[2], "numeric_code":row[3], "iso_31662_code":detailink}
        scraperwiki.sqlite.save(unique_keys=["eng_short_name"], data=data, table_name="s_iso31661")

        html = get_html(detailink)
        doc = lxml.html.fromstring(html)

        for count, tr in enumerate(doc.cssselect('tr')):
            # the original presumably transliterated here via unidecode (note
            # the import and the *_utf8 field below); restoring that distinction:
            row = [unidecode(td.text_content()) for td in tr.cssselect('td')]
            row2 = [td.text_content() for td in tr.cssselect('td')]
            if len(row)>0:
                if row[0][:2] == detailink[11:]:
                    now = datetime.datetime.now()
                    data = {"tmsp_scraped":str(now), "iso_31662_code":detailink, "region_code":row[0], "region_desc":row[1], "region_desc_utf8":row2[1]}
                    scraperwiki.sqlite.save(unique_keys=["iso_31662_code","region_code"], data=data, table_name="s_iso31662_region")

One more nice library: https://github.com/neuront/python-iso3166

Answered by user21707 on April 15, 2021

Try this list:

https://gist.github.com/eparreno/205900

It has ISO 2-letter, 3-letter and numeric codes with short-form country names.

Answered by Mark Micallef on April 15, 2021

I would like to add pycountry, because the question has a python tag and it seems to be what you want. From the docs:

ISO country, subdivision, language, currency and script definitions and their translations

pycountry provides the ISO databases for the standards:

639 Languages

3166 Countries

3166-3 Deleted countries

3166-2 Subdivisions of countries

4217 Currencies

15924 Scripts

The package includes a copy from Debian's pkg-isocodes and makes the data accessible through a Python API.
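A minimal usage sketch (recent pycountry releases expose the codes as alpha_2/alpha_3 attributes; very old releases used alpha2/alpha3):

import pycountry

# look up a single country by one of its codes
germany = pycountry.countries.get(alpha_2='DE')
print(germany.name, germany.alpha_2, germany.alpha_3, germany.numeric)
# Germany DE DEU 276

# or dump the full alpha-2 / alpha-3 list
for country in pycountry.countries:
    print(country.alpha_2, country.alpha_3)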

Answered by einSelbst on April 15, 2021


A PHP array with 3-letter ISO country codes, from the Wikipedia article

I copied and pasted the list from Wikipedia and created the array. Maybe this code can save some time for somebody who wants to create an array of country codes. I am not familiar with Python, but array creation should be similar to PHP.

$Countries = array(
    "ABW", "AFG", "AGO", "AIA", "ALA", "ALB", "AND", "ARE", "ARG", "ARM",
    "ASM", "ATA", "ATF", "ATG", "AUS", "AUT", "AZE", "BDI", "BEL", "BEN",
    "BES", "BFA", "BGD", "BGR", "BHR", "BHS", "BIH", "BLM", "BLR", "BLZ",
    "BMU", "BOL", "BRA", "BRB", "BRN", "BTN", "BVT", "BWA", "CAF", "CAN",
    "CCK", "CHE", "CHL", "CHN", "CIV", "CMR", "COD", "COG", "COK", "COL",
    "COM", "CPV", "CRI", "CUB", "CUW", "CXR", "CYM", "CYP", "CZE", "DEU",
    "DJI", "DMA", "DNK", "DOM", "DZA", "ECU", "EGY", "ERI", "ESH", "ESP",
    "EST", "ETH", "FIN", "FJI", "FLK", "FRA", "FRO", "FSM", "GAB", "GBR",
    "GEO", "GGY", "GHA", "GIB", "GIN", "GLP", "GMB", "GNB", "GNQ", "GRC",
    "GRD", "GRL", "GTM", "GUF", "GUM", "GUY", "HKG", "HMD", "HND", "HRV",
    "HTI", "HUN", "IDN", "IMN", "IND", "IOT", "IRL", "IRN", "IRQ", "ISL",
    "ISR", "ITA", "JAM", "JEY", "JOR", "JPN", "KAZ", "KEN", "KGZ", "KHM",
    "KIR", "KNA", "KOR", "KWT", "LAO", "LBN", "LBR", "LBY", "LCA", "LIE",
    "LKA", "LSO", "LTU", "LUX", "LVA", "MAC", "MAF", "MAR", "MCO", "MDA",
    "MDG", "MDV", "MEX", "MHL", "MKD", "MLI", "MLT", "MMR", "MNE", "MNG",
    "MNP", "MOZ", "MRT", "MSR", "MTQ", "MUS", "MWI", "MYS", "MYT", "NAM",
    "NCL", "NER", "NFK", "NGA", "NIC", "NIU", "NLD", "NOR", "NPL", "NRU",
    "NZL", "OMN", "PAK", "PAN", "PCN", "PER", "PHL", "PLW", "PNG", "POL",
    "PRI", "PRK", "PRT", "PRY", "PSE", "PYF", "QAT", "REU", "ROU", "RUS",
    "RWA", "SAU", "SDN", "SEN", "SGP", "SGS", "SHN", "SJM", "SLB", "SLE",
    "SLV", "SMR", "SOM", "SPM", "SRB", "SSD", "STP", "SUR", "SVK", "SVN",
    "SWE", "SWZ", "SXM", "SYC", "SYR", "TCA", "TCD", "TGO", "THA", "TJK",
    "TKL", "TKM", "TLS", "TON", "TTO", "TUN", "TUR", "TUV", "TWN", "TZA",
    "UGA", "UKR", "UMI", "URY", "USA", "UZB", "VAT", "VCT", "VEN", "VGB",
    "VIR", "VNM", "VUT", "WLF", "WSM", "YEM", "ZAF", "ZMB", "ZWE"
);

Answered by MrWeix on April 15, 2021

There's a great dataset over at the Open Knowledge Foundation too, which includes ISO 3166 alpha-3, alpha-2 and numeric codes as well as many others.

http://data.okfn.org/data/core/country-codes#data

https://github.com/datasets/country-codes
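A short pandas sketch for pulling just the two code columns; the raw-file path and the column names ('ISO3166-1-Alpha-2', 'ISO3166-1-Alpha-3') are assumptions based on the repository layout at the time of writing, so check the repo's data/ directory if they have moved:

import pandas as pd

# assumed raw-file location in the datasets/country-codes repository
URL = ('https://raw.githubusercontent.com/datasets/country-codes/'
       'master/data/country-codes.csv')

df = pd.read_csv(URL, dtype=str)
# assumed column names; inspect df.columns if these have changed
print(df[['ISO3166-1-Alpha-2', 'ISO3166-1-Alpha-3']].head())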

Answered by mjs2020 on April 15, 2021

If you don't want to hard-code the country list (which I don't recommend, because it changes often), use this URL, which returns the 2-letter code and country name in JSON format: annsystem.com/api/getCountry

It also includes UN and non-UN member countries.

For details and parameters, see: flossk.org/en/blog/country-list-good-all

Answered by user1319829 on April 15, 2021

I updated @scw's script, which scrapes the data from Wikipedia. It now uses requests instead of urllib2 and Beautiful Soup 4 instead of BeautifulSoup 3, and it outputs JSON instead of writing to a CSV file.

import json
import bs4
import requests

print(json.dumps(
    [
        {
            ['name', 'alpha_2', 'alpha_3', 'numeric'][no]:
            td.find_all()[-1].text
            for no, td in enumerate(row.find_all('td')[:-1])
        }
        for row in bs4.BeautifulSoup(
            requests.get('http://en.wikipedia.org/wiki/ISO_3166-1').text
        ).find('table', {'class': 'wikitable sortable'}).find_all('tr')[1:]
    ],
    indent=4,
    ensure_ascii=False
))

It outputs JSON like this:

[
    {
        "name": "Afghanistan",
        "alpha_3": "AFG",
        "alpha_2": "AF",
        "numeric": "004"
    },
    {
        "name": "Åland Islands",
        "alpha_3": "ALA",
        "alpha_2": "AX",
        "numeric": "248"
    },

    ...

Answered by gitaarik on April 15, 2021

On many Linux distributions, a list of ISO country codes is installed by default under:

/usr/share/xml/iso-codes/iso_3166.xml

Under Fedora/CentOS/RHEL/Debian, the package that contains this file is called iso-codes (project homepage).

The XML file contains the mapping in a hierarchical structure:

<iso_3166_entries>
    <iso_3166_entry
            alpha_2_code="AF"
            alpha_3_code="AFG"
            numeric_code="004"
            name="Afghanistan"
            official_name="Islamic Republic of Afghanistan" />
[..]

It can be transformed into a record-based format (e.g. for database import) via XPath and a shell one-liner:

$ xmllint --noout --xpath \
     '//iso_3166_entry/@*[name() = "alpha_2_code" or name()="alpha_3_code"]' \
     /usr/share/xml/iso-codes/iso_3166.xml \
    | sed 's/alpha_2/\nalpha_2/g' \
    | awk -F'"' 'OFS="," {print $2,$4}'
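The same file can also be read directly with Python's standard library; a minimal ElementTree sketch, using only the attribute names visible in the XML sample above:

import xml.etree.ElementTree as ET

tree = ET.parse('/usr/share/xml/iso-codes/iso_3166.xml')
# iterate over every <iso_3166_entry> element and print its code attributes
for entry in tree.getroot().iter('iso_3166_entry'):
    print('{},{}'.format(entry.get('alpha_2_code'), entry.get('alpha_3_code')))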

Alternatively, one can use the Python module pycountry to read and transform the codes from that package (note that newer pycountry releases name the attributes alpha_2 and alpha_3 rather than alpha2 and alpha3), e.g.:

$ pip3 install --user pycountry
$ python3
>>> import pycountry
>>> for i in pycountry.countries:
...   print('{},{}'.format(i.alpha2,i.alpha3))
...
AF,AFG
AX,ALA
AL,ALB
[..]

Answered by maxschlepzig on April 15, 2021

In case any R users stumble upon this thread, here's the R solution:

The countrycode package contains a full list of country codes in many different formats. From the package documentation:

Supports the following coding schemes: Correlates of War character, CoW-numeric, ISO3-character, ISO3-numeric, ISO2-character, IMF numeric, International Olympic Committee, FIPS 10-4, FAO numeric, United Nations numeric, World Bank character, official English short country names (ISO), continent, region.

The package will also convert between different codes and can identify countries by standard or non-standard names using regular expressions.

library(countrycode)
# data frame of country names and codes
head(countrycode_data)
# convert from CoW to ISO3
countrycode(c("ALG","ALB","UKG","CAN","USA"), origin = "cowc", destination = "iso3c")
# ISO2 code from non-standard name
countrycode("Britain", "country.name", "iso2c")

Answered by Matt SM on April 15, 2021

Simply use Microsoft Excel's Power BI tools to extract the data from Wikipedia. It takes less than 30 seconds to scrape the page into Excel and then save it in whatever format you like.

Answered by Christian McGhee on April 15, 2021

This is an old thread, but it's worth updating with this.

Forward/reverse lookups on alpha-2 and alpha-3 country codes; it returns a comprehensive object per country that includes phone codes, currency, ISO info, IOC info, postal codes, and more: https://github.com/rolinger/iso-country-data-validation

Answered by rolinger on April 15, 2021

I found a very nice database in this GitHub repo: https://github.com/stefangabos/world_countries

At the moment of writing, the repository consists of JSON, CSV and SQL files for 22 languages with different country codes: ISO 3166-1 alpha-3, ISO 3166-1 alpha-2 and the full names.

The database seems to be updated quite regularly.
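A hypothetical loading sketch; the raw-file path and the field names ('alpha2', 'alpha3', 'name') are assumptions about the repository layout at the time of writing, so verify them against the repo before relying on this:

import json
import urllib.request

# assumed location of the English-language country list in the repo
URL = ('https://raw.githubusercontent.com/stefangabos/world_countries/'
       'master/data/countries/en/world.json')

with urllib.request.urlopen(URL) as resp:
    countries = json.load(resp)

for c in countries:
    # assumed field names; inspect one record first to confirm
    print(c['alpha2'].upper(), c['alpha3'].upper(), c['name'])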

Answered by ruX on April 15, 2021

Get the data from the World Bank API.

They offer responses in both XML and JSON.

For further information, see their developer page: https://datahelpdesk.worldbank.org/knowledgebase/articles/898590-country-api-queries
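A minimal sketch against the v2 country endpoint; the 'id' (alpha-3) and 'iso2Code' (alpha-2) field names reflect the API as documented at the time of writing. Note that the list also contains aggregate regions, not only countries:

import requests

resp = requests.get('https://api.worldbank.org/v2/country',
                    params={'format': 'json', 'per_page': 400})
# the response is a two-element array: [paging metadata, country records]
metadata, countries = resp.json()

for c in countries:
    print(c['iso2Code'], c['id'], c['name'])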

Answered by hiFI on April 15, 2021

Updated one of the original answers to use urllib3 and bs4:

# https://gis.stackexchange.com/questions/1047/seeking-full-list-of-iso-alpha-2-and-iso-alpha-3-country-codes
# If you want to periodically update your list, you could scrape one of the sources and parse its results into a
# useful format. I've done so here for converting the Wikipedia country code list into a CSV:

import csv
import urllib3
from bs4 import BeautifulSoup

url = 'http://en.wikipedia.org/wiki/ISO_3166-1'
http = urllib3.PoolManager()
response = http.request('GET', url)
print(f"responce: {response.status}")
print(f"received: {len(response.data)} bytes")

soup = BeautifulSoup(response.data, features="html.parser")

# "Current Codes" is second table on the page
t = soup.findAll('table', {'class' : 'wikitable sortable'})[1]

# create a new CSV for the output
iso_csv = csv.writer(open('ripped_wikipedia-iso-country-codes.csv', 'w'), quoting=csv.QUOTE_MINIMAL)

# get the header rows, write to the CSV
iso_csv.writerow([th.findAll(text=True)[0].strip() for th in t.findAll('th')])

# Iterate over the table pulling out the country table results. Skip the first
# row as it contains the already-parsed header information.
for row in t.findAll("tr")[1:]:
    tds = row.findAll('td')
    raw_cols = [td.findAll(text=True) for td in tds]
    # first take the name
    cols = [raw_cols[0][1]]
    # for all other columns, use the first result text and strip line breaks
    cols.extend([col[0].strip() for col in raw_cols[1:]])
    iso_csv.writerow(cols)

Answered by Martin on April 15, 2021
