Drupal Fire - Quick Roundup from important Drupal blogs and sites
Air pollution is a leading cause of death across the globe, contributing to stroke, heart disease, lung cancer, and chronic respiratory illnesses. While the vast majority of deaths occur in low- and middle-income countries, air quality continues to worsen in cities across the world.
Giving citizens the power to measure their own air quality is one way to turn the tide. With accurate, low-cost sensors, people can measure the concentration of harmful particulate matter in their homes or places of work. With several sensors scattered across an area, citizens create meaningful data that they can use to advocate for better policy, zoning laws, and regulation.
We’ve been working with a group of hardware engineers, infrastructure builders, and journalists to develop an air quality monitoring system powered by an open API and low-cost sensors. Earth Journalism Network (EJN) and Internews designed, manufactured, and deployed Dustduinos, Arduino-based sensors that detect particulate matter at 2.5µm and 10µm. The Dustduino uses an open spec optimized for low power consumption and both SMS and wifi communication.
Our role, together with Feedback Labs, FrontlineSMS, GroundTruth, EJN, and Internews, was to turn the raw input data into actionable information. We built a data pipeline to make this data publicly available through a flexible API as well as for download. The API allows anyone to build apps on top of this information or integrate it with other tools. Downloads allow researchers and advocacy organizations to work with the data in tabular format.
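To make the idea concrete, here is a minimal sketch of the kind of analysis such an API enables. The response shape and field names below are assumptions for illustration, not the actual API schema:

```python
import json

# Hypothetical sensor API response; the field names are illustrative,
# not the real Dustduino API schema.
sample_response = json.loads("""
{
  "sensor_id": "dustduino-042",
  "readings": [
    {"timestamp": "2015-05-01T10:00:00Z", "pm25": 18.2, "pm10": 41.0},
    {"timestamp": "2015-05-01T11:00:00Z", "pm25": 22.7, "pm10": 39.5},
    {"timestamp": "2015-05-01T12:00:00Z", "pm25": 35.1, "pm10": 55.3}
  ]
}
""")

def mean_pm25(response):
    """Average PM2.5 concentration (µg/m³) across a set of readings."""
    readings = response["readings"]
    return sum(r["pm25"] for r in readings) / len(readings)

# The WHO 2005 guideline for 24-hour mean PM2.5 exposure is 25 µg/m³.
WHO_PM25_24H = 25.0

avg = mean_pm25(sample_response)
print(f"mean PM2.5: {avg:.1f} µg/m³, exceeds guideline: {avg > WHO_PM25_24H}")
```

Even a comparison this simple, once aggregated across several sensors in a neighborhood, is the kind of evidence citizens can bring to policy discussions.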
To ensure connectivity in areas without wifi, FrontlineSMS, a company already doing great work bridging SMS-enabled devices and citizen users, augmented their services to support the Dustduino.
Open, available air quality data can give citizens in vulnerable areas more say in the policies that affect their local air quality, and provide researchers with valuable insight into potentially understudied areas. This kind of change happens from the ground up. We will continue to support it by building open-source data pipelines, better sensors, and a robust community for open air quality data.
I gave my State of Drupal presentation at DrupalCon Los Angeles in front of 3,000+ attendees. In case you didn't attend DrupalCon Los Angeles, you can watch the recording of my keynote or download a copy of my slides (PDF, 77 MB).
In the first part of the keynote, I talked about the history of the Drupal project, some of the challenges we overcame, and some of the lessons we learned. While I have talked about our history before, the last time was 6 years ago, at DrupalCon Washington, DC in 2009. In those 6 years, the Drupal community has grown so large that most people in it don't know where we came from. Understanding the history of Drupal is important: it explains our culture, holds us together in challenging times, and provides a compass for where we are heading.
In the middle part of the keynote, I talked about what I believe is one of our biggest challenges: motivating more organizations to contribute more meaningfully to Drupal's development. Just as it is important to understand the history of Drupal, talking about the present is an important foundation for everyone in the community. It is hard to grow without the context of our current state.
In the third and last part of the keynote, I looked forward and talked about my vision for the big reverse of the web and how it relates to Drupal. The way the web is evolving gives us an opportunity to better understand our sites' visitors and users and to build one-to-one relationships, something much of our society has lost since the industrial revolution. If the web evolves the way I think it will, it will be both life-changing and industry-changing. While it won't be without concerns, we have a huge opportunity ahead of us, and Drupal 8 will help us build toward that future.
I'm proud of where we came from and excited for where we are headed. Take a look at the keynote if you want to learn more about it.
It may be a wild risk, like blowing $1 on a lottery ticket, but more traditionally it means putting personal or business capital at risk.
This is not really about owning or managing a company so much as it is a state of mind. A person who thinks of themselves as an independent contractor dedicated to a specific business may treat their work with more regard than a traditional "employee."
I noted this statement and didn't reference it to anyone, so I'd like to think I came to this conclusion on my own.
Thoughts about what makes the difference between a gas station attendant and a millionaire.
Jeff talks to Deane Barker of Blend Interactive about the art and practice of content management, the joy of solving complicated problems, and his upcoming O'Reilly book Web Content Management.
Last week, some colleagues from Cocomore and I attended DrupalCamp Spain 2015. The Spanish Drupal community is awesome, and they put all their effort into making this 6th edition (the 5th I have attended) another unforgettable event.
The event spanned three days of activities: Business Day and sprints on Friday, and sessions on Saturday and Sunday.
Robin Tolochko loves maps. Her favorite map is an 1868 map of South America. We know that because it’s on her resume.
Robin shares her map love. She directed a mapping lab in Bogota and teaches others to make maps at Maptime Madison. Come see her lightning talk on Mapathons at State of the Map US.
At Development Seed, Robin will make beautiful maps that are intuitive and purposeful. Robin brings curiosity and a detail-oriented eye to all the work she does. She’s committed to advancing women’s rights and renewable energy. She also owns a small business that sells handmade leather goods from Colombia.
On April 21, 2015, Google rolled out a set of changes to its search algorithm so sweeping it dubbed them "Mobilegeddon." Together, these updates dramatically boosted the impact of a site’s "mobile-friendliness" on its search rankings. Google says the changes will have "significant impact in our search results", though at least for now they only affect search results on mobile devices.
Today we’re releasing a beta version of OpenAerialMap. OpenAerialMap makes it easy to share and find open satellite and drone imagery. This is critical to the work of the disaster response community. We are launching this tool in close partnership with the Humanitarian OpenStreetMap Team (HOT).
OpenAerialMap is a set of tools for searching, sharing, and using open satellite and drone imagery. This initial release includes the core infrastructure to catalog petabytes of open imagery. It also includes an extremely usable API and an elegant web interface to submit, search and download available imagery.
- Search for available imagery.
- Select scenes by grid.
- Preview imagery and view its metadata.
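As a rough sketch of how a catalog search like this works, here is a minimal bounding-box query over an in-memory catalog. The catalog entries and field names are illustrative assumptions, not the actual OpenAerialMap API:

```python
# Toy imagery catalog; bbox is (min_lon, min_lat, max_lon, max_lat)
# and gsd_m is ground sample distance in meters. Illustrative only.
catalog = [
    {"id": "scene-001", "bbox": (-74.1, 4.5, -73.9, 4.7), "gsd_m": 0.3},
    {"id": "scene-002", "bbox": (36.7, -1.4, 36.9, -1.2), "gsd_m": 0.5},
]

def intersects(a, b):
    """True if two (min_lon, min_lat, max_lon, max_lat) boxes overlap."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def search(catalog, bbox):
    """Return catalog entries whose footprint intersects the query box."""
    return [s for s in catalog if intersects(s["bbox"], bbox)]

# Query a box around Nairobi: only scene-002 matches.
print([s["id"] for s in search(catalog, (36.0, -2.0, 37.0, -1.0))])
```

A real catalog would index footprints spatially rather than scan a list, but the bounding-box intersection test is the core of any imagery search.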
Rebooting a great concept
The OpenAerialMap concept has bounced between several attempts over half a decade, and previous attempts failed to take off. HOT reinitiated the concept this year with funding from the Humanitarian Innovation Fund. We worked with HOT and others in the open imagery community to reimagine an approach to OpenAerialMap that we expect to be much more successful.
- We focused on a simple, usable toolset that meets the clear needs of the humanitarian response community. The underlying architecture is flexible enough to be immediately useful to research, resource management, urban planning, and other communities. However, we decided to first build a frictionless interface for the clear needs of the disaster response community.
- We are extremely focused on community. From day one, we involved other organizations and developers like Azevea, Planet Labs, Cadasta, OpenDroneMap and HOT’s own developers.
- With these and other groups, we reimagined OpenAerialMap along a network model. Rather than try to house all the open imagery out there, OpenAerialMap is a node and index for a larger network of open imagery - the Open Imagery Network.
A network approach to open imagery
Open Imagery Network (OIN) is a simple framework for releasing imagery under an open license. Participants in OIN adopt a common metadata scheme to describe the imagery they are making available, along with standardized ways to broadcast and access that data. This allows us to build tools that search across all open imagery data without requiring one entity to host all of it. We’re working with HOT, Planet Labs, Cadasta, Azavea, OpenDroneMap, and others to develop OIN and to build OpenAerialMap as the first node in that network.
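A common metadata scheme is what makes cross-node search possible: every participant describes imagery the same way, so one tool can index all of it. The sketch below shows the idea with a minimal record validator; the field names are assumptions for illustration, not the official OIN specification:

```python
# Hypothetical required fields for an OIN-style imagery metadata record.
# These names are illustrative, not the actual OIN spec.
REQUIRED_FIELDS = {"uuid", "title", "provider", "platform",
                   "acquisition_start", "acquisition_end", "footprint"}

def validate_metadata(record):
    """Return the set of required fields missing from a metadata record."""
    return REQUIRED_FIELDS - set(record)

record = {
    "uuid": "https://example.org/imagery/abc123.tif",
    "title": "Post-disaster drone survey, example area",
    "provider": "Example Drone Team",
    "platform": "uav",
    "acquisition_start": "2015-06-01T08:00:00Z",
    "acquisition_end": "2015-06-01T09:30:00Z",
    "footprint": "POLYGON((-38.4 -12.9, -38.3 -12.9, -38.3 -12.8, "
                 "-38.4 -12.8, -38.4 -12.9))",
}

print(validate_metadata(record))  # empty set: record is complete
```

Because each node publishes records in the same shape, an indexer like OpenAerialMap can validate and ingest imagery from any participant without custom integration work.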
Community input is critical to the development of OpenAerialMap. Let us know what you think about the beta version and send us feedback on Twitter or open an issue on GitHub. We’ll work to add functionality and features toward an initial release candidate later this summer.
The Geospatial World Forum is well under way in Lisbon and we’re impressed with the diversity of people and talks at the event.
On Thursday night, we are organizing an Open Geospatial Happy Hour at Café Fábulas with our friends from Planet Labs. This Happy Hour is a great opportunity to meet people and talk about open data, mapping and satellites in a more informal setting.
You don’t have to be attending the conference to join the happy hour, just let us know if you’re coming. You can RSVP here.
If you are participating in the Geospatial World Forum, make sure to check out our workshop and talk in the exhibition hall on Thursday morning.
Nearly 100 million barrels of oil flow through the global oil supply chain every day. But not all oils are created equal. When you consider the full oil processing lifecycle, some types of oil are responsible for nearly twice as much greenhouse gas as others. This is important information for oil procurement and energy policy. Smarter oil selection can lead to significant reductions in greenhouse gas emissions without even touching overall oil consumption. We worked with the Carnegie Endowment to launch the Oil Climate Index website to help consumers and policymakers make smarter decisions on oil.
Extracting Oil Data
Oil emission data has traditionally been extremely hard to find. The source data for most oil fields is often secret, and some of the models needed to calculate total greenhouse gas emissions have been proprietary. The Carnegie Endowment made a major contribution to understanding the climate impacts of fossil fuels with the Oil-Climate Index, the first study to use entirely open-source models for evaluating greenhouse gas emissions. These models were developed in a collaboration between the Carnegie Endowment’s Energy and Climate Program, Stanford University, and the University of Calgary. In addition, the Oil-Climate Index collected model input data for 30 popular and emerging oils.
The data is extremely complex and nuanced. Depending on what you want to do with an oil (making diesel versus jet fuel, for example), different oils may be better. An improvement in an extraction or refinement process may have a significant climate benefit for one oil but not for another. To make this data useful to scientists, investors, policymakers, and interested citizens, we built a flexible data exploration tool. The tool makes reasonable assumptions to allow immediate comparison, but also allows users to explore how specific factors change the overall climate impact of each oil. Most importantly, all of the data and the modeling methodologies are open and available for download.
Designing for complexity
To visualize the Oil-Climate Index, we weren’t just showing static data; we were visualizing the results of complex models, and processing all the data in the browser would be impractical. Instead, we picked the most meaningful model input parameters and calculated their results up front, limiting graphing options to provide flexibility without unnecessary complexity. We built processing tools behind the site to recalculate this data as Carnegie adds data on new oils.
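The precomputation pattern can be sketched in a few lines: enumerate every combination of the exposed parameters, run the model once per combination, and ship the resulting lookup table to the browser. The parameter names and the toy emissions function below are made-up illustrations, not Carnegie's actual models:

```python
import itertools

# Hypothetical user-facing parameters; illustrative only.
REFINED_PRODUCTS = ["gasoline", "diesel", "jet_fuel"]
FLARING_PCTS = [0, 2, 5]  # percent of gas flared during extraction

def toy_emissions(oil, prod, flaring_pct):
    """Stand-in for the real open-source model chain (kgCO2e per barrel)."""
    base = oil["base_intensity"]  # upstream intensity, kgCO2e per barrel
    refining = {"gasoline": 40, "diesel": 35, "jet_fuel": 30}[prod]
    return base + refining + 5 * flaring_pct  # made-up flaring penalty

def precompute(oils):
    """Evaluate every parameter combination up front, keyed for lookup."""
    table = {}
    for oil, prod, pct in itertools.product(oils, REFINED_PRODUCTS,
                                            FLARING_PCTS):
        table[(oil["name"], prod, pct)] = toy_emissions(oil, prod, pct)
    return table

oils = [{"name": "Example Light", "base_intensity": 60},
        {"name": "Example Heavy", "base_intensity": 110}]
lookup = precompute(oils)
print(lookup[("Example Heavy", "diesel", 5)])  # 110 + 35 + 25 = 170
```

The browser then only does dictionary lookups, so the interactive charts stay fast no matter how heavy the underlying models are; the trade-off is that exposed parameters must be limited to a small discrete set.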
For more information on the Oil-Climate Index, hit us up on Twitter, ask @DxGordon, or check out these resources: