Friends of transparency, good governance and open data - it's time to activate!
The regulation implementing the groundbreaking anti-corruption law, the Cardin-Lugar Provision of the Dodd-Frank Act (Section 1504), is in critical danger of being repealed by the House and Senate. This law allows citizens in resource-rich countries to ‘follow the money’ and hold their governments accountable for graft, waste and abuse.
This regulation is supported by civil society groups around the world, investors with nearly $10 trillion in assets under management, government officials and nearly all major oil, gas and mining companies.
Please, call your Senators' district and DC offices using the phone numbers in the spreadsheets below and make your outrage known.
If you need some help, here is what you can say when you call:
I am calling because I am a concerned citizen from [State]. I have heard that Congress is attempting to undo an important oil and gas anti-corruption regulation, called Section 1504 of Dodd-Frank, using the Congressional Review Act.
This regulation, also known as the Cardin-Lugar Provision, is meant to give citizens like me information about whether we are getting a good deal on our nation's natural resources. It also protects our national energy and security interests by decreasing corruption abroad.
After trying to dismantle the Office of Congressional Ethics, it makes no sense for Congress to go after ANOTHER anti-corruption provision.
I want to know if you (or your boss) plan to vote AGAINST any efforts to undo this critical provision.
Thank you for your support!
Scroll to the bottom to see additional actions you can take through our partner organizations. Check out our page on RulesAtRisk.org for more information.
Priority Senate Offices
Other Ways to Take Action on the Cardin-Lugar Anti-Corruption Rule:
In our first video training session, we presented a walkthrough of how to organize USEITI data for use in the open source mapping software QGIS. Fortunately, that dataset included geographic identifiers called Federal Information Processing Standard (FIPS) county codes: five-digit codes identifying counties and county equivalents throughout the United States. However, not every dataset will include a geographic identifier alongside data attributed to a location. Google Refine is a powerful and versatile tool that allows users to clean, manipulate, and transform their data. In this post we will walk through the process of using Google Refine to add geographic coordinates to a dataset.
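For context on why FIPS codes make mapping easy: each five-digit code concatenates a two-digit state code with a three-digit county code, so the pieces can be pulled apart mechanically. A minimal Python sketch:

```python
def split_fips(fips_code):
    """Split a five-digit FIPS county code into its state and county parts."""
    code = str(fips_code).zfill(5)  # restore any leading zero, e.g. "1001" -> "01001"
    return code[:2], code[2:]       # (state code, county code)

# 48201 is Harris County, Texas: state 48 = Texas, county 201 = Harris
state, county = split_fips("48201")
print(state, county)  # -> 48 201
```

Datasets without such an identifier need a geocoding step, which is what the rest of this post covers.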
Step 1 - Download and Install Google Refine
Navigate to the OpenRefine download page, and download Google Refine 2.5 for your operating system. Google Refine operates as a hybrid desktop and web application. When you run Google Refine, a browser window should open automatically and present you with the Google Refine web interface. Despite operating within a web browser window, Google Refine does not require an active internet connection to work. As long as the Google Refine application is running, you can navigate to http://127.0.0.1:3333/ to access the web interface.
Before we move to the next step, take a moment to download the following .csv file. This dataset was downloaded from ResourceProjects.org, and was reduced to only include 2015 projects carried out by Tullow Oil. Google Refine is a powerful piece of software; however, it can quickly get bogged down by very large datasets, so this file was limited to one company for the purposes of this tutorial.
Step 2 - Upload your dataset to Google Refine
To get started, click ‘Create Project’. You will be presented with a number of options for data inputs. We will create a new project using data from ‘this computer.’ Select the file downloaded in the step above, and click next to start the process of uploading the dataset.
Step 3 - Add a new column to fetch location information
With the dataset uploaded, Google Refine will present a preview of the entries. Review the data and headers to make sure everything appears as it should. At the bottom of the window check that the ‘Parse next’ box is ticked so that the first row entries are parsed as column headers.
Click the ‘create project’ button in the upper right corner to proceed to the main working space of Google Refine. As noted above, we will be adding in additional geographic information to this dataset. To do so, click the triangle in the ‘Paid to’ column and navigate to ‘Edit column’ > ‘Add column by fetching URLs…’
A window will pop up as shown below. Name the column and enter the following text in the ‘Expression’ box. (Click here to learn more about General Refine Expression Language)
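The expression builds a lookup URL for each value in the ‘Paid to’ column, which Google Refine then fetches. As a hedged sketch of what that URL construction does, here is a Python equivalent; the endpoint shown is an assumption based on the address-bar experiment described in the next step, so check it against your own expression:

```python
from urllib.parse import quote

# Assumed Google Maps Geocoding endpoint; the exact URL in your expression
# may differ, but the pattern -- base URL plus an escaped place name -- is the same.
BASE_URL = "http://maps.google.com/maps/api/geocode/json?sensor=false&address="

def geocode_url(place_name):
    """Build the lookup URL that Google Refine fetches for one cell value."""
    return BASE_URL + quote(place_name)

print(geocode_url("Ghana"))
```

Escaping the place name matters: names containing spaces or punctuation must be percent-encoded to form a valid URL, which is why the GREL expression escapes each cell value rather than concatenating it raw.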
Click ‘OK’ and the expression will produce a column containing what is essentially the output of a search of the Google Maps application programming interface (API) for each term in the ‘Paid to’ column. This operation will typically take several minutes to complete, depending on the size of the dataset. While you wait for the process to complete, you can experiment to get a better sense of how this function works. Enter the expression we just used, leaving off the last portion, into the address bar of another browser window:
Fill in the name of any location in the world after the “=” and you will see a page with all the relevant geographic information for that location. This should give you a better sense of what is happening under the hood with the ‘fetching URLs’ function in Google Refine.
Step 4 - Add another column to parse the information from the previous step
Once the process has completed, you will see a column filled with a long string of text and numbers.
To clean this up we will add another column parsing through that data. Click on the triangle in the new column you created in Step 3 containing all the Google maps information, and select ‘Edit column’ > ‘Add column based on this column…’ Write in a title for this new column and enter in the following text into the ‘Expression’ box:
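The parsing expression digs the latitude and longitude out of the JSON fetched in Step 3. Here is a Python sketch of the same extraction, run against a trimmed-down, hypothetical geocoding response; the field names are an assumption, so compare them against the actual text in your fetched column:

```python
import json

# Trimmed, hypothetical geocoding response like the one fetched in Step 3
response_text = """
{"results": [{"geometry": {"location": {"lat": 7.946527, "lng": -1.023194}}}],
 "status": "OK"}
"""

def extract_coordinates(raw_json):
    """Return 'lat, lng' from the first result in a geocoding response."""
    location = json.loads(raw_json)["results"][0]["geometry"]["location"]
    return "{}, {}".format(location["lat"], location["lng"])

print(extract_coordinates(response_text))  # -> 7.946527, -1.023194
```

The GREL expression does the same walk through the nested structure, cell by cell, producing one comma-separated coordinate pair per row.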
Click ‘OK’ and the new column will populate with a neat set of latitude and longitude coordinates, separated by a comma, derived from the data in the column we produced in Step 3.
Step 5 - Export your project
The final step is to click ‘Export’ in the upper right corner of the Google Refine window. Select ‘Comma-separated value’ or ‘Excel’ from the dropdown list of file types.
You can then open the exported file in a desktop application to delete the column containing the unparsed location information while leaving the second column we created that includes the latitude and longitude coordinates. Google Refine is ideal for refining, cleaning and adding to a dataset, but operations like deleting rows and columns should be done in programs like Excel.
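If you would rather script that last cleanup step than open a spreadsheet program, the same column deletion is a few lines of Python; the column names below are hypothetical stand-ins for whatever you named your columns:

```python
import csv
import io

def drop_column(csv_text, column_to_drop):
    """Return CSV text with the named column removed from every row."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    fields = [f for f in rows[0].keys() if f != column_to_drop]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)  # values for the dropped column are ignored
    return out.getvalue()

# Hypothetical export: keep 'coordinates', drop the raw 'location_json' column
exported = 'Paid to,location_json,coordinates\nGhana,{...},"7.94, -1.02"\n'
print(drop_column(exported, "location_json"))
```

Either way, the goal is the same: a tidy file with one coordinate column, ready for QGIS or any other mapping tool.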
While this post demonstrates how latitude and longitude coordinates can be derived from a country name, the exact same process can be carried out for any other location. If instead of country names the dataset contained the names of cities or provinces, the same steps can be used to obtain the latitude and longitude coordinates. Location information can help you to create persuasive maps and other visualizations of your data. To learn more about what can be done with extractives data and mapping, navigate to the training section of Extract-A-Fact.
Always on the lookout for interesting data, I was excited when I recently came across a comprehensive trove of data on offshore production in the Gulf of Mexico from the Bureau of Ocean Energy Management (BOEM). The datasets at data.boem.gov include:
However, because the data is interspersed between four datasets, it is downright complicated to find out which leases a company owns, what the lease attributes are, and how much oil and gas has been produced from the lease. With that in mind, I have created an interactive map application that combines the geographic data, the ownership data, and the production data into one easy-to-use tool.
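The combining step behind the tool is conceptually simple: each dataset keys on a lease identifier, so ownership and production records can be attached to the geographic records by that key. A hedged Python illustration of that join (the field names and values below are hypothetical, not BOEM's actual schema):

```python
# Hypothetical rows keyed by lease number; BOEM's real schemas differ.
leases = {"G01234": {"area": "Mississippi Canyon", "block": "778"}}
owners = {"G01234": ["Company A (60%)", "Company B (40%)"]}
production = {"G01234": {"oil_bbl": 1_500_000, "gas_mcf": 2_300_000}}

def combine(lease_no):
    """Merge lease attributes, ownership, and production for one lease."""
    record = dict(leases.get(lease_no, {}))   # lease attributes
    record["owners"] = owners.get(lease_no, [])
    record.update(production.get(lease_no, {}))
    return record

print(combine("G01234"))
```

The map application performs this join once, up front, so users never have to cross-reference four downloads by hand.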
See the full-sized map here.
This map allows the user to view:
Take a look at the map, hosted on our Github page at the following link: https://pwypusa.github.io/pages/gulf_explorer.html
Click around and let us know what you find out, here or on Twitter @PWYPUSA!
There are two types of disclosures. One is disclosure for the sake of transparency, while the other is disclosure that actually works for the people it is intended to help. Ensuring the latter is the philosophy Bantay Kita has applied to its engagement with natural resources data.
When the Philippine Extractive Industries Transparency Initiative (PHEITI) Country Report was first published in 2014, an incredible amount of data was made available to the public. As a civil society representative to the PHEITI Multi Stakeholder Group, Bantay Kita was tasked with making sure that the data was used to facilitate greater accountability.
In July 2016, Bantay Kita soft-launched its DATA Portal - short for Demanding Action, Transparency, and Accountability Portal. The portal was conceptualized in Jakarta, Indonesia during the first Publish What You Pay Data Extractors Program workshop, and later brought to life in the second Data Extractors workshop in Harare, Zimbabwe.
How much did it cost? $0
What programming language was used? I'm not even sure - I don't know a single one.
How long did it take to make the prototype? Minus the snack breaks and random Facebook checking, about four hours.
All it really took was some creativity and a handful of free web tools, each of which can be learned in one to six hours, depending on the person's willingness to learn.
The DATA Portal is community-targeted, hence it uses project-level, provincial and regional data related to oil, gas and mining extraction. Data was analyzed from the Philippine EITI Country Report and other sources, such as the Mines and Geosciences Bureau.
The portal includes data for all 18 administrative regions in the Philippines. Each "Regional Page" has subpages for its Extractive Projects Database, News about Extractives, and Extractives Statistics. On top of this, there are "Company Pages" where data for individual projects can be found, such as production numbers, government payments, employee demographics, social and environmental spending, community demographics, poverty incidence, transparency measure, and so on.
There are digital metric tons of data available on extractives, but not all of it is relevant to specific communities. Since our soft launch in July, we have been traveling the Philippines to conduct open data workshops and collect locally translated versions of "data user templates." We ask local CSOs to fill them out with a specific advocacy goal in mind, identifying what kind of data would best help them influence decision makers and other stakeholders to support that goal. This makes the data we produce not only specific to a certain community, but also relevant to it.
What's quite unique about the DATA Portal is that it's not simply a box full of big datasets and tables, but rather a collection of data visualizations and infographics that make the information easier to understand. This makes the numbers less intimidating (especially for those traumatized by college algebra).
The next step for the DATA Portal is to make Action and Accountability happen. To that end, we hope to begin providing communities with quarterly data - production and sales - disclosed by extractive companies through SMS. Since most taxes and royalties are based on sales, this information will enable local communities to estimate what subnational transfers from mining activities their communities should receive, and thus enable them to plan for the following year. This information can also be used by indigenous peoples’ organizations that receive royalties. Though the schedule of payments can vary, at a minimum these disclosures will allow communities to validate the accuracy of the funds they receive.
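To make the estimation idea concrete, here is a rough sketch of the arithmetic a community could apply to a disclosed sales figure. The tax rate and local-government share below are placeholders for illustration, not the actual Philippine rates, which depend on the tax type and current law:

```python
# Placeholder rates for illustration only; actual Philippine excise rates and
# local-government shares vary by tax type and are set by law.
EXCISE_TAX_RATE = 0.04   # hypothetical excise tax on gross mineral sales
LGU_SHARE = 0.40         # hypothetical share of collections sent to localities

def estimate_local_transfer(quarterly_sales):
    """Estimate the subnational transfer a community might expect
    from one quarter of disclosed mineral sales."""
    excise_collected = quarterly_sales * EXCISE_TAX_RATE
    return excise_collected * LGU_SHARE

print(estimate_local_transfer(10_000_000))  # -> 160000.0
```

Even a back-of-the-envelope figure like this gives a community a benchmark against which to validate the transfers it actually receives.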
Making a difference takes more than making data open. Stakeholders need to make these disclosures relevant to communities to actually make an impact. Through our DATA Portal, Bantay Kita hopes to do just that.
Marco Zaplan is the Research and Communications Officer for Bantay Kita. Follow him @zaplanmarco
Click here for the archives to see our full list of posts.