Why Data Sharing is Important: A Summary from WEMC’s Data SIG Public Webinar
Electrification is a key driver of decarbonization in the energy sector, needed to limit and reduce carbon emissions over the coming decades. It requires, in particular, an increase in power generation from renewable sources, including hydro, wind and solar power. As a consequence, the energy sector's dependence on meteorological parameters, climate variability and climate change will increase. Meteorology has a key role to play in improving energy planning and operations at all timescales, from long-term resource assessment to risk management and generation operation and dispatch.
A critical aspect of improving renewables forecasts and their integration into the grid is the use and, where possible, the exchange of data among the various actors, for example to set up, calibrate and evaluate forecasting models. Many specialists currently agree that data formats, standards and access are a limiting factor in the development of tools and activities in the area of energy & meteorology.
This public webinar, presented by WEMC’s Data Exchange, Access and Standards Special Interest Group (Data SIG), led by Laurent Dubus (EDF) and Sue Haupt (NCAR), communicated why data sharing is important for improving renewable energy forecasts, how data exchange is organized in meteorology, and what could be done in the energy industry to assist with faster progress in integrating more renewable energy into the grid. Laurent Dubus and Sue Haupt were joined by Mikkel Westenholz, Managing Director of ENFOR, a Danish provider of energy forecasting and optimization solutions for the energy sector, and Lars Peter Riishojgaard, Project Manager of the World Meteorological Organization’s Integrated Global Observing System (WIGOS).
The recording of the webinar can be viewed here, with summaries of key points from each of the contributors below.
Why data sharing is important
Sue Haupt (NCAR) believes that sharing data can aid situational awareness and enable prediction for the good of all. Her perspective comes from having built forecasting systems both for operational use and for research, and experience has shown Sue that it ‘takes a community’, beginning with the end user and reaching all the way back to the research community and funding organisations, for data sharing to be effective and productive.
What really needs to happen to connect the research to the end user is some sort of research-to-operations translation and communication.
Sue gave an example of a project involving a partnership of public, private and academic partners, where the end users communicated what they needed. The United States Department of Energy ‘SunShot’ program provided funding to NCAR, several other government labs and universities to do the research on how to forecast better. A complex system was built that tested multiple ways of forecasting, but it required observations and data throughout the system. These data came not only from satellites and specialized instruments such as sky imagers, but also from ground-based meteorological observations and from local observations provided by the utilities and independent system operators (ISOs) involved in the project. Having that type of data is necessary to enable a high-quality, well-functioning system. In addition, extra data from sites, not only meteorological data but also data on power, is critical to good forecasts.
Observations lead the way – Fred Carr, American Meteorological Society 2017
A challenge that arises from such a range and volume of data is that it comes in very disparate forms.
Some are point data, some are gridded data, some are pixel data. There are large volumes of data. Those of you who have downloaded satellite data or full NWP data have had to figure out how to winnow that down to a manageable amount of data.
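As a purely illustrative sketch of that winnowing step (the file name, variable name and coordinate ranges below are invented for the example, not taken from the webinar), gridded NWP output can be subset to a region of interest before any further processing, assuming a tool such as the xarray Python library:

```python
# Illustrative only: cut a full gridded NWP file down to a region of interest.
# File name, variable name and coordinate ranges are hypothetical examples.
import xarray as xr

ds = xr.open_dataset("nwp_forecast.nc")            # the full (large) gridded dataset
subset = ds["surface_solar_radiation"].sel(
    latitude=slice(45, 30),                        # many NWP grids store latitude descending
    longitude=slice(255, 270),                     # longitudes here assumed on a 0-360 grid
)
subset.to_netcdf("nwp_forecast_subset.nc")         # a much smaller file for downstream use
```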
Another issue Sue notices when working with data is inconsistent timestamps, for example whether UTC or local time has been used. And if local time is used, does it account for daylight saving in those parts of the world that observe it? Time zone issues are important. For example, in the United States some utilities span multiple time zones, and it has been observed that the time zone on the data reflects the time zone in which the data were downloaded rather than the location of the particular plant. These challenges, amongst others, demonstrate why it is critical to have standardized data.
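As a minimal sketch of what standardizing timestamps involves (the time zone and example values below are assumptions for illustration, not data from the webinar), a plant's local clock times can be converted to UTC before exchange, for instance with Python's standard zoneinfo module:

```python
# Minimal sketch: normalise a plant's local timestamps (with daylight saving) to UTC.
# The time zone and example timestamps are hypothetical.
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library, Python 3.9+

def to_utc(local_string: str, plant_timezone: str) -> datetime:
    """Parse a naive local timestamp and return it as timezone-aware UTC."""
    naive = datetime.strptime(local_string, "%Y-%m-%d %H:%M")
    localized = naive.replace(tzinfo=ZoneInfo(plant_timezone))  # DST handled by the zone database
    return localized.astimezone(ZoneInfo("UTC"))

# The same local clock reading maps to different UTC instants in winter and summer:
print(to_utc("2019-01-15 09:00", "America/Denver"))  # 2019-01-15 16:00:00+00:00
print(to_utc("2019-07-15 09:00", "America/Denver"))  # 2019-07-15 15:00:00+00:00
```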
Not addressing data issues can increase costs, raise frustrations and degrade the quality of forecasts. Amongst the benefits of addressing them are aiding data assimilation, training AI algorithms, helping to initialize real-time algorithms and verify models, and helping with the smooth real-time operation of wind and solar plants (by knowing temperatures, exact cloud cover, etc.). It is clear that by bringing all these types of data into the meteorological community, and especially into applications for the energy industry, and now with the ‘internet of things’ (something Sue and NCAR have been working on for a while), there is an opportunity to bring in large amounts of real-time data and integrate them in ways that provide decision support.
Sue and her colleagues at NOAA have performed parallel research demonstrating that when multiple data sources, from, say, wind farms, are utilised, publicly available forecasts improve. This benefits the entire community, including the renewable energy community.
[There are] positives of data sharing when we have data and assimilate them into NWP models. Various studies, including some that NCAR has done, have shown that the NWP model can do better when we have very local data.
Sue finished by presenting some recommendations. For instance, utilities should start sharing their data with national centers so that the national centers can improve their forecasts for everybody, including the energy industry. Historical data needs to be saved, as it is valuable for training machine learning algorithms. Data needs to be recorded in standardized formats, such as using UTC timestamps to bring it in line with meteorological conventions, and having more data will improve the situation for all stakeholders, especially end-users in the energy industry.
Data communication standards for operational renewable energy forecasting
In his session, Mikkel Westenholz (ENFOR) presented a methodology for easing communication between forecast users and forecast providers and for developing a common terminology.
From Mikkel’s point of view, the current situation of data terminologies and communication between forecast providers and users is both time consuming and inefficient. There is no standard way of exchanging data, and each set-up between providers and users requires terminology to be defined as part of the process of initiating a data exchange. Not only does this come with an inherent risk of misunderstanding, but the necessary clarification process is time consuming and repetitive, particularly when it comes to setting up the data communication between the forecast providers and the forecast users.
In the future, Mikkel hopes that standardized definitions and data exchanges will be the norm. In particular, the freeing-up of time will provide clear benefits.
Time can be spent on understanding more complex and atypical business requirements. We can spend time on modeling and improving forecast accuracy instead of on data integration and clarifying terminology. Ultimately the goal is to be able to provide forecasts of high quality at lower cost, making renewables more competitive…
The initiative, involving ENFOR and other partners, is taking a pragmatic approach to creating structure and consistency in the terminology, utilising two levels of standardisation. To aid the development process, forecast providers and forecast users (mainly energy companies and utilities) are invited to join the work. Mikkel calls for support with a structured process for developing, reviewing and releasing new versions of the standards, but also for reviewing existing and related standards to see if inspiration can be found. This work is also part of a sub-task on standardization for IEA Task 36 (http://www.ieawindforecasting.dk), and there will be coordination with other similar initiatives.
Mikkel gave an overview of the current status of the initiative, including further information about the logical layer and metadata standards, and gave an example of the level of flexibility currently available in the methodology.
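As a purely illustrative sketch of the kind of ambiguity a shared terminology removes (the field names below are invented for this example and are not taken from the ENFOR-led standard), a forecast exchanged between a provider and a user becomes unambiguous once the message states explicitly how its values are to be interpreted:

```python
# Purely illustrative: explicit, agreed-upon fields that remove ambiguity when a forecast
# provider sends a time series to a user. All field names are invented for this example.
forecast_message = {
    "site_id": "example_wind_farm_01",
    "variable": "active_power",
    "unit": "MW",
    "time_reference": "UTC",                 # no guessing about local time or DST
    "value_type": "interval_average",        # vs. "instantaneous"
    "interval_minutes": 60,
    "issue_time": "2019-08-29T06:00:00Z",
    "series": [
        {"valid_time": "2019-08-29T07:00:00Z", "value": 42.5},
        {"valid_time": "2019-08-29T08:00:00Z", "value": 40.1},
    ],
}
```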
Those who are interested in getting involved with this initiative are encouraged to contact Mikkel at miw@enfor.dk.
How the weather and climate community is organized for data exchange and standards
Lars Peter Riishojgaard gave a World Meteorological Organization (WMO) perspective on why observational data exchange is needed, and the role that WMO plays.
Lars demonstrated the need for data exchange by visualising, on a map of the globe, the different spatial scales of observational data needed for different forecast time scales. For instance, to make an effective 1-day forecast for the United States of America, observational data need to cover the whole North American region, some way west into the Pacific and a little east into the Atlantic. Forecasts that look 2-4 days ahead, however, need observations covering almost the entire northern hemisphere and a little of the southern hemisphere. For forecasts looking 5-7 days ahead, which are now common, you have what Lars describes as a “global observational problem”.
It doesn’t matter whether you sit in Beijing or in Washington or in Sydney, you need the same observational data. Even if you care about the forecast only over a very specific local area, you essentially need observations from all over the globe… Meteorology knows no boundaries… a corollary… in meteorology [is that] ignorance knows no boundaries either.
Lars clearly states that there is nothing new in this: the same fact was behind the creation of the World Weather Watch (WWW) Program around 1960. It has been a “fantastic example” of international collaboration for more than 50 years, demonstrating that meteorologists cannot do their work without such teamwork.
A key element of the effectiveness of this collaboration has been the Global Observing System (GOS). The WMO itself does not take measurements; instead, these come from a very large mix of the 193 WMO members, consisting of national weather services, marine services, space agencies and so on. The WMO, meanwhile, is the regulatory body ensuring standardization of elements such as what is being measured, where, how often and how that information is exchanged internationally. Using reasonable assumptions and simple calculations, Lars states that if such agreements were instead made bilaterally between individual states, close to 20,000 individual agreements would be required. This would be “spectacularly inefficient”.
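As a rough check on that figure (assuming, for illustration, one agreement per pair of members), pairwise agreements among 193 members scale as n(n-1)/2:

```python
# Rough check of the "close to 20,000" figure, assuming one bilateral agreement per pair
# of the 193 WMO members (an illustrative assumption, not a statement from the webinar).
members = 193
bilateral_agreements = members * (members - 1) // 2  # n(n-1)/2 distinct pairs
print(bilateral_agreements)  # 18528 -- indeed close to 20,000
```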
Tying into Sue’s presentation earlier in the webinar, Lars gave a brief overview of how the WMO helps to ‘softly’ facilitate the regional exchange of observations through its Regional Basic Observing Network (RBON). A more recent approach with “more teeth” is the Global Basic Observing Network (GBON), which has mandatory spatial resolution requirements and a mandatory reporting frequency. Given the importance and usefulness of sharing observations as described by Lars earlier, it would be intuitive to think there was no need to monitor the network of global observations, as everyone can see the benefit. Unfortunately, this is not the case. Too few of the monitored stations actually report everything that is required at the temporal frequency set out; those that do are largely located in Europe, Japan, eastern parts of South America and the east coast of Australia.
[This] amounts to a tremendous amount of lost opportunity for the WMO members to do a better job in terms of issuing forecasts, warnings and climate analysis and products.
The aim of GBON is to address this, not only by improving poorly reporting areas and bringing them in line with well-reporting regions such as Europe and Japan, but also by filling in areas where there is no data reporting at all, which will require funding support from entities such as the World Bank. An estimated US$350m of additional capital investment is needed, with US$150m in annual operating costs. These figures are actually not that high in comparison to what the World Bank and Green Climate Fund are already spending on such issues.
Selected questions and answers
Responses are paraphrased/edited; full responses can be accessed by listening to the webinar recording.
Q: Is data privacy also an issue which needs to be addressed?
A (Sue Haupt): Yes. It is suggested that there could be a forum where, if you’re contributing private data, it goes into a national center acting as a trusted third party that protects those data, makes sure that the data itself does not become widely available, but ensures that it does get used well in making forecasts better. In working with the private sector, we realize that privacy really is an issue. Non-disclosure agreements could be signed with the national centers you are providing the data to.
A (Lars Peter Riishojgaard): From a WMO perspective, it is an extremely important issue because of the realization of the enormous economic importance of meteorological data products. Some private entities are not only concerned about not giving away their data, they are actively lobbying their governments to stop our international data sharing practices, because they perceive that entities like the WMO are undermining a potential market they could develop for selling observations. It’s not just a question of privacy, it is also a question of the value of information, and we will be forced to undertake an overhaul of our data policies in the interest of the private sector.
Q: How do you deal with data quality in a standard way?
A (Lars Peter Riishojgaard): The WMO has been setting standards for many years in terms of how and where things should be measured. The modern philosophy behind WIGOS is to be much more inclusive. Meteorological observations used to be made by technically knowledgeable people with somewhat specialized instrumentation; this was done by national weather services and was relatively easy to standardize, but that paradigm is now going away. Most of us have smartphones with barometers and GPS in them, so a pressure measurement combined with a three-dimensional coordinate location is a valuable meteorological observation. Many of us drive cars that have thermometers and rain sensors in them, and with internet connectivity those are also mobile observing platforms. Those are just two examples. There is a heterogeneity of information out there that is not really amenable to standardization, but we can still make use of it. So the modern practice is to follow a very strict metadata standard: everything can, in principle, be included in WIGOS as long as the requisite metadata comes along with it, and it is then up to the user to make an informed decision. So we tend to focus a lot more on the metadata standard and the metadata repository rather than really enforcing a standard on the observations themselves.
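As a purely hypothetical illustration of the idea (the field names below are invented for this example and are not the actual WIGOS metadata standard), a crowd-sourced pressure report becomes usable once it carries enough descriptive metadata for the downstream user to judge it on its merits:

```python
# Hypothetical illustration only: descriptive metadata travelling with a crowd-sourced
# observation so that a downstream user can make an informed decision. Field names are
# invented for this example and do not come from the actual WIGOS metadata standard.
smartphone_pressure_report = {
    "observed_variable": "surface_air_pressure",
    "value_hpa": 1013.2,
    "timestamp_utc": "2019-08-29T12:05:00Z",
    "latitude": 52.63,
    "longitude": 1.30,
    "altitude_m": 24.0,
    "platform_type": "smartphone",            # vs. "synoptic_station", "vehicle", ...
    "sensor": "uncalibrated MEMS barometer",
    "quality_control": "none",
}
```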
Thank you to Laurent Dubus for organising and leading this webinar, and thank you to Sue Haupt, Mikkel Westenholz and Lars Peter Riishojgaard for sharing their insights and expertise.
What is the Data Exchanges, Access and Standards Special Interest Group (WEMC Data SIG)?
Data is key to designing, building and validating effective products and services for the energy sector. Meteorological data has reached a high level of quality and availability thanks to worldwide collaboration among national meteorological services and intergovernmental organisations, under the leadership of the World Meteorological Organization (WMO). Energy data are not at the same level, as no single approach, standard or format has yet been established.
This SIG aims to evaluate and develop best practices and standards for the collection, structure and exchange of relevant data; web-based standard, open, inter-operable platforms for the dissemination of data; funding solutions for long-term development and maintenance of databases; and intellectual property issues.
For more information about WEMC’s Special Interest Groups (SIGs) click here.
About WEMC
Established in 2015, the World Energy & Meteorology Council (WEMC) is a non-profit organisation based at the University of East Anglia in Norwich, UK. WEMC’s key aim is to promote and enhance the interaction between the energy industry and the weather, climate and broader environmental sciences community. WEMC is led by Managing Director, Professor Alberto Troccoli. For regular updates on WEMC activity, including our International Conference Energy & Meteorology (ICEM), visit www.wemcouncil.org, follow us on Twitter @WEMCouncil and LinkedIn, or sign up for our newsletter.
WEMC Membership
WEMC offers free membership to professionals in the energy, meteorology and related sectors. Benefits include exclusive access to information and resources in our Members Area, access to our Special Interest Groups, opportunities to contribute to WEMC publications, and the chance to connect with leading research and industry experts. Become a WEMC Member.
Edited & published by Kit Rackley, 29 August 2019