Drupal Fire - Quick Roundup from important Drupal blogs and sites
Last week, at the amazing Drupal North regional conference, I gave a talk on Backdrop: an alternative fork of Drupal. The slides from the talk are attached below, in PDF format.
A little over a year ago we launched the Acquia Certification Program for Drupal. We ended the first year with close to 1,000 exams taken, which exceeded our goal of 300-600. Today, I'm pleased to announce that the Acquia Certification Program has passed another major milestone: over 1,000 exams passed (not just taken).
People (myself included) have debated the pros and cons of software certifications for years, so I want to give an update on our certification program and some of the lessons learned.
Acquia's certification program has been a big success. A lot of Drupal users require Acquia Certification, from the Australian government to Johnson & Johnson. We also see many of our agency partners use the program as a tool in the hiring process. While a certification exam cannot guarantee someone will be great at their job (e.g. we only test for technical expertise, not for attitude), it does give a frame of reference to work from. The feedback we have heard time and again is that the Acquia Certification Program is tough but fair, validating skills and knowledge that are important to both customers and partners.
Certification Magazine's Salary Survey also listed Acquia Certification as one of the most desired credentials to obtain. For a first-year program to be identified among certification leaders like Cisco and Red Hat speaks volumes about the respect our program has established.
Creating a global certification program is resource intensive. We've learned that it requires the commitment of a team of Drupal experts to work on each and every exam. We now have four different exams: developer, front-end specialist, back-end specialist and site builder. It takes roughly 40 work days for the initial development of one exam, and about 12 to 18 work days for each exam update. We update all four of our exams several times per year. In addition to creating and maintaining the certification programs, there is also the day-to-day operation of running the program, which includes providing support to participants and ensuring the exams are in place for testing around the globe, both online and at test centers. However, we believe that effort is worth it, given the overall positive effect on our community.
We also learned that benefits are important to participants and that we need to raise the profile of those who achieve these credentials, especially holders of the new Acquia Certified Grand Master credential (those who passed all three developer exams). We have a special Grand Master Registry and look to create a platform for these Grand Masters to share their expertise and thoughts. We do believe that if you have a Grand Master working on a project, you have a tremendous asset working in your favor.
At DrupalCon LA, the Acquia Certification Program offered a test center at the event, and we ended up having 12 new Grand Masters by the end of the conference. We saw several companies stepping up to challenge their best people to achieve Grand Master status. We plan to offer the testing at DrupalCon Barcelona, so take advantage of the convenience of the on-site test center and the opportunity to meet and talk with Peter Manijak, who developed and leads our certification efforts, myself and an Acquia Certified Grand Master or two about Acquia Certification and how it can help you in your career!
Queries are the centerpiece of MySQL and they have high optimization potential (in conjunction with indexes). This is especially true for big databases (whatever "big" means). Modern PHP frameworks tend to execute dozens of queries per request. Thus, as a first step, you need to know which queries are slow. A built-in solution for that is the MySQL slow query log. It can be activated either in my.cnf or dynamically via the slow_query_log system variable. In both cases, long_query_time should be reduced to an appropriate value.
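As a sketch, the persistent my.cnf variant might look like this (the log file path is an assumption; the runtime equivalent is `SET GLOBAL slow_query_log = 'ON';`):

```ini
# my.cnf — enable the slow query log persistently
[mysqld]
slow_query_log      = 1
slow_query_log_file = /var/log/mysql/slow.log   # path is an assumption
long_query_time     = 1                         # log queries slower than 1 second
```

After restarting the server (or setting the variables at runtime), every query exceeding long_query_time lands in the log and is a candidate for indexing or rewriting.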
The web was born as an open, decentralized platform allowing different people in the world to access and share information. I got online in the mid-nineties when there were maybe 100,000 websites in the world. Google didn't exist yet and Steve Jobs had not yet returned to Apple. I remember the web as an "open web" where no one was really in control and everyone was able to participate in building it. Fast forward twenty years, and the web has taken the world by storm. We now have hundreds of millions of websites. Look beyond the numbers and we see another shift: the rise of a handful of corporate "walled gardens" like Facebook, Google and Apple that are becoming both the entry point and the gatekeepers of the web. Their dominance has given rise to major concerns.
We call them "walled gardens" because they control the applications, content and media on their platform. Examples include Facebook or Google, which control what content we get to see; or Apple, which restricts us to running approved applications on iOS. This is in contrast to the "open web", where users have unrestricted access to applications, content and media.
Facebook is feeling the heat from Google, Google is feeling the heat from Apple but none of these walled gardens seem to be feeling the heat from an open web that safeguards our privacy and our society's free flow of information.
This blog post is the result of people asking questions and expressing concerns about a few of my last blog posts like the Big Reverse of the Web, the post-browser era of the web is coming and my DrupalCon Los Angeles keynote. Questions like: Are walled gardens good or bad? Why are the walled gardens winning? And most importantly; how can the open web win? In this blog post, I'd like to continue those conversations and touch upon these questions.
Are "walled gardens" good or bad for the web?
What makes this question difficult is that the walled gardens don't violate the promise of the web. In fact, we can credit them for amplifying the promise of the web. They have brought hundreds of millions of users online and enabled them to communicate and collaborate much more effectively. Google, Apple, Facebook and Twitter have a powerful democratizing effect by providing a forum for people to share information and collaborate; they have made a big impact on human rights and civil liberties. They should be applauded for that.
At the same time, their dominance is not without concerns. With over 1 billion users each, Google and Facebook are the platforms that the majority of people use to find their news and information. Apple has half a billion active iOS devices and is working hard to launch applications that keep users inside their walled garden. The two major concerns here are (1) control and (2) privacy.
First, there is the concern about scale and control. These organizations shape the news that most of the world sees. When too few organizations control the media and flow of information, we must be concerned. They are very secretive about their curation algorithms and have been criticized for inappropriate censoring of information.
Second, they record data about our behavior as we use their sites (and the sites their ad platforms serve), inferring information about our habits and personal characteristics, possibly including intimate details that we might prefer not to disclose. Every time Google, Facebook or Apple launch a new product or service, they are able to learn a bit more about everything we do and control a bit more about our life and the information we consume. They know more about us than any other organization in history, and do not appear to be restricted by data protection laws. They won't stop until they know everything about us. If that makes you feel uncomfortable, it should. I hope that one day, the world will see this for what it is.
While the walled gardens have a positive and democratizing impact on the web, who is to say they'll always use our content and data responsibly? I'm sure that to most critical readers of this blog, the open web sounds much better. All things being equal, I'd prefer to use alternative technology that gives me precise control over what data is captured and how it is used.
Why are the walled gardens winning?
Why then are these walled gardens growing so fast? If the open web is theoretically better, why isn't it winning? These are important questions about the future of the open web, open source software, web standards and more. It is important to think about how we got to a point of walled garden dominance, before we can figure out how an open web can win.
The biggest reason the walled gardens are winning is because they have a superior user experience, fueled by data and technical capabilities not easily available to their competitors (including the open web).
Unlike the open web, walled gardens collect data from users, often in exchange for free use of a service. For example, having access to our emails or calendars is incredibly important because it's where we plan and manage our lives. Controlling our smartphones (or any other connected devices such as cars or thermostats) provides not only location data, but also a view into our day-to-day lives. Here is a quick analysis of the types of data top walled gardens collect and what they are racing towards:
On top of our personal information, these companies own large data sets ranging from traffic information to stock market information to social network data. They also possess the cloud infrastructure and computing power that enables them to plow through massive amounts of data and bring context to the web. It's not surprising that the combination of content plus data plus computing power enables these companies to build better user experiences. They leverage their data and technology to turn “dumb experiences” into smart experiences. Most users prefer smart contextual experiences because they simplify or automate mundane tasks.
Can the open web win?
I still believe in the promise of highly personalized, contextualized information delivered directly to individuals, because people ultimately want better, more convenient experiences. Walled gardens have a big advantage in delivering such experiences, however I think the open web can build similar experiences. For the open web to win, we first must build websites and applications that exceed the user experience of Facebook, Apple, Google, etc. Second, we need to take back control of our data.
Take back control over the experience
The obvious way to build contextual experiences is by combining different systems that provide open APIs; e.g. we can integrate Drupal with a proprietary CRM and commerce platform to build smart shopping experiences. This is a positive because organizations can take control over the brand experience, the user experience and the information flow. At the same time, users don't have to trust a single organization with all of their data.
The current state of the web: one end-user application made up of different platforms that each have their own user experience and presentation layer and store their own user data.
To deliver the best user experience, you want "loosely-coupled architectures with a highly integrated user experience". Loosely-coupled architectures so you can build better user experiences by combining your systems of choice (e.g. integrate your favorite CMS with your favorite CRM with your favorite commerce platform). Highly-integrated user experiences so you can build seamless experiences, not just for end-users but also for content creators and site builders. Today's open web is fragmented. Integrating two platforms often remains difficult and the user experience is "mostly disjointed" instead of "highly integrated". As our respective industries mature, we must focus our attention on integrating the user experience as well as the data that drives that user experience. The following "marketecture" illustrates that shift:
Instead of each platform having its own user experience, we have a shared integration and presentation layer. The central integration layer serves to unify data coming from distinctly different systems. Consistent with the "Big Reverse of the Web" theory, the presentation layer is not limited to a traditional web browser but could include push technology like a notification.
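As a toy sketch of that central integration layer (the field names are invented for illustration, not any real API), unification can be as simple as mapping each system's payload into one presentation model:

```python
def unify(cms_payload, crm_payload):
    """Merge data from two distinct systems into a single presentation payload.

    The field names are hypothetical; real systems would expose them
    through their respective open APIs.
    """
    return {
        "recently_read": cms_payload.get("articles_read", []),
        "past_orders": crm_payload.get("orders", []),
    }

# One payload for the presentation layer, regardless of which system owns the data.
profile = unify({"articles_read": ["post-1"]}, {"orders": [42]})
```

The presentation layer (browser, notification, watch face) then renders this unified model without knowing which backend each field came from.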
For the time being, we have to integrate with the big walled gardens. They need access to great content for their users. In return, they will send users to our sites. Content management platforms like Drupal have a big role to play, by pushing content to these platforms. This strategy may sound counterintuitive to many, since it fuels the growth of walled gardens. But we can't afford to ignore ecosystems where the majority of users are spending their time.
Control personal data
At the same time, we have to worry about how to leverage people's data while protecting their privacy. Today, each of these systems or components contain user data. The commerce system might have data about past purchasing behavior, the content management system about who is reading what. Combining all the information we have about a user, across all the different touch-points and siloed data sources will be a big challenge. Organizations typically don't want to share user data with each other, nor do users want their data to be shared without their consent.
The best solution would be to create a "personal information broker" controlled by the user. By moving the data away from the applications to the user, the user can control what application gets access to what data, and how and when their data is shared. Applications have to ask the user permission to access their data, and the user explicitly grants access to none, some or all of the data that is requested. An application only gets access to the data that we want to share. Permissions only need to be granted once but can be revoked or set to expire automatically. The application can also ask for additional permissions at any time; each time the person is asked first, and has the ability to opt out. When users can manage their own data and the relationships they have with different applications, and by extension with the applications' organizations, they take control over their own privacy. The government has a big role to play here; privacy law could help accelerate the adoption of "personal information brokers".
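As an illustrative sketch of this permission model (every class and method name here is hypothetical, not an existing API), a broker might mediate access like this:

```python
class PersonalInformationBroker:
    """Minimal sketch of a user-controlled data broker (hypothetical design)."""

    def __init__(self, data):
        self.data = data    # the user's data, keyed by field name
        self.grants = {}    # app -> set of fields the user has approved

    def request_access(self, app, fields, approve):
        """The app asks; the user explicitly approves none, some or all fields."""
        if approve:
            self.grants.setdefault(app, set()).update(fields)

    def revoke(self, app):
        """Permissions can be revoked by the user at any time."""
        self.grants.pop(app, None)

    def read(self, app, field):
        """An application only sees the data the user chose to share."""
        if field in self.grants.get(app, set()):
            return self.data[field]
        raise PermissionError(f"{app} has no grant for {field}")
```

For example, a shop granted access to `email` can read it, but a read of `location` raises `PermissionError` until the user approves a new request; after `revoke`, even `email` is off-limits again.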
Instead of each platform having its own user data, we move the data away from the applications to the users, managed by a "personal information broker" under the user's control.
The user's personal broker manages data access to different applications.
People don't seem so concerned about their data being hosted with these walled gardens, since they've willingly given it away to date. For the time being, "free" and "convenient" will be hard to beat. However, my prediction is that these data privacy issues will come to a head in the next five to ten years, and lack of transparency will become unacceptable to people. The open web should focus on offering user experiences that exceed those provided by walled gardens, while giving users more control over their data and privacy. When the open web wins through improved transparency, the closed platforms will follow suit, at which point they'll no longer be closed platforms. The best case scenario is that we have it all: a better data-driven web experience that exists in service to people, not in the shadows.
In this week's Drupalize.Me podcast, hostess Amber Matz chats about all things Project Management with Seth Brown (COO at Lullabot) and Lullabot Technical Project Managers Jessica Mokrzecki and Jerad Bitner. To continue the conversation, check out Drupalize.Me's series on Project Management featuring interviews and insights from these fine folks and others at Lullabot.
Back in October, I received the Oculus Rift DK2 for my birthday and found what I think will be the future of how we build websites, interact with customers, and communicate with each other. I found a community of enthusiasts who are building virtual worlds with the very same concepts we use to build two-dimensional websites today. This community is pushing the boundaries of the web in a way that is unlike any other, and we’re having an absolute blast discovering the possibilities.
Ethan Marcotte and Karen McGrane discuss the highs and lows of large-scale responsive design projects, and explain how they targeted and promoted their series of responsive design workshops.
In this episode of Hacking Culture, Matthew Tift talks with Holly Ross, the Executive Director of the Drupal Association, about the Drupal community, the Drupal Association, non-profits, business, tax codes, and more. They get into some controversial issues, and some of Holly's answers may surprise you!
At yesterday's Worldwide Developer Conference keynote, Apple announced its annual updates to iOS, OS X, and the new watchOS. As usual, the Apple rumor blogs correctly predicted most of the important announcements weeks ago, but one important piece of news only leaked a few hours before the keynote: the launch of a new application called "News". Apple's News app press release noted: "News provides beautiful content from the world's greatest sources, personalized for you".
Apple basically cloned Flipboard to create News. Flipboard was Apple's "App of the Year" in 2010, and it remains one of the most popular reading applications on iOS. This isn't the first time Apple has chosen to compete with its ecosystem of app developers. There is even a term for it: "Sherlocking".
But forget about Apple's impact on Flipboard for a minute. The release of the News app signifies a more important shift in the evolution of the web, the web content management industry, and the publishing industry.
Impact on content management platforms
Why is Apple's News app a big deal for content management platforms? It's a big deal because there are half a billion active iOS devices and Apple will ship its News app to every single one of them. It will accelerate the trend of websites becoming less relevant as end-point destinations.
Some of the other new iOS 9 features will also add fuel to the fire. For example, Apple's search service Spotlight will get an upgrade, allowing third-party services to work directly with Apple's search feature. Spotlight can now "deep link" to content inside of a website or application, further eliminating websites or applications as end-points. You could search for a restaurant in Yelp directly from your home screen, and go straight to Yelp's result page without having to open the Yelp website or application. Add to that the Apple Watch, which doesn't even ship with a web browser, and it's clear that Apple is about to accelerate the post-browser era of the web.
The secret to the News app is the new Apple News Format, rumored to be an RSS-like feed with support for additional design elements such as image and video support, custom fonts, etc. Apple uses these feeds to aggregate content from different news sources, uses machine learning to match the best content to a given user, and provides a clean, consistent look and feel for articles coming from these various news sources. That is the long way of saying that Apple decides what the best content is for you, and what the best format is to deliver it in. It is a profound change, but for most people this will actually be a superior user experience.
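For reference, this is what a minimal item in a standard RSS 2.0 feed looks like; the Apple News Format was unannounced at the time of writing, so this is plain RSS, not Apple's actual schema, which is rumored to layer design metadata on top of this idea:

```xml
<item>
  <title>Example headline</title>
  <link>https://example.com/article</link>
  <description>A short summary of the article.</description>
  <pubDate>Mon, 08 Jun 2015 10:00:00 GMT</pubDate>
</item>
```

The point is that the publisher ships structured content and metadata only; the aggregator owns the presentation.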
When content is "pushed" to you by smart aggregators, using a regular browser doesn't make much sense. You benefit from a different kind of browser for the web. For content management platforms, it redefines the browser and websites as end-points; de-emphasizing the role of presentation while increasing the importance of structured content and metadata. Given Apple's massive install base, this will further accelerate the post-browser era of the web.
I don't know about your content management platform, but Drupal is ready for it. It was designed for a content-first mentality while many competitive content management systems continue to rely on a dated page-centric content model. It was also designed to be a content repository capable of outputting content in multiple formats.
The release of Apple News is further proof that data-driven experiences will be the norm, and of what I have been calling the Big Reverse of the Web: the idea that for the web to reach its full potential, it will go through a massive re-architecture from a pull-based architecture to a push-based architecture. After the Big Reverse of the Web is complete, content will find you, rather than you having to find content. Apple's News and Flipboard are examples of what such push-based experiences look like; they "push" relevant and interesting content to you rather than you having to "pull" the news from multiple sources yourself.
Impact on publishing industry
Forget the impact on Flipboard or content management platforms; the impact on the publishing world will be even more significant. The risk for publishers is that they are disintermediated as the distribution channel and that their brands become less valuable. It marks a powerful transformation that could de-materialize and de-monetize much of the current web and publishing industry.
Because of its massive installed base, Apple will now own a large part of the distribution channel, and it will have an outsized influence on what hundreds of millions of users read. If we've learned one thing in the short history of the Internet, it is that cutting out middlemen is a well-known recipe for success.
This doesn't mean that online news media have lost. Maybe Apple News can actually save them? Apple could provide publishers large and small with an immense distribution channel by giving them the ability to reach every iOS user. Apple isn't alone in this vision: Facebook recently rolled out an experiment with select publishers like Buzzfeed and the New York Times called Instant Articles.
In a "push economy" where a publisher's brand is devalued and news is selected by smart aggregators, the best content could win; not just the content that is associated with the most well-known publishing brands with the biggest marketing budgets. Publishers will be incentivized to create more high-quality content -- content that is highly customized to different target audiences, rather than generic content that appeals to large groups of people. Success will likely rely on Apple's ability to use data to match the right content to each user.
This isn't necessarily bad. In my opinion, the web isn't dead, it's just getting started. We're well into the post-PC era, and now Apple is helping to move consumers beyond the browser. It's hard to not be cautiously optimistic about the long-term implications of these developments.
Today I gave a lightning talk at State of the Map about how to organize a mapathon. A mapathon, also known as a mapping party, is when a bunch of people get together to edit OpenStreetMap, the editable map of the world. Here’s a guide for organizing a mapathon, based on my experience organizing mapathons in Bogota, Colombia, and Madison, Wisconsin, along with great advice from people who have organized OpenStreetMap mapathons all over the world (thank you, you lovely people!).
First things first: there is no one right way to run a mapathon. I will outline things to consider, but you don’t need to do all of these. Do what works for you, and lean on the wonderful OSM community for support when you have questions.
Find a co-organizer (or two)
A lot can go into organizing a mapathon, including the preparation beforehand and the actual event. It’s helpful to have someone to split up the duties with and who can help attendees on the day of the event.
Determine your priority
Are you organizing a mapathon to add a lot of data to the map, or are you trying to introduce new people to this awesome project? These things aren’t mutually exclusive, but it’s a good idea to keep your main goal in mind as you are planning and running the mapathon.
Find a location
Common places to hold mapathons include local schools, universities, businesses, libraries, restaurants, cafes, or parks. You’ll want to check on the internet connection at the location to try to avoid connectivity issues. Some mapathons have had issues when lots of people try to upload or download data from OpenStreetMap, so you can also check if different computers can connect through different IP addresses.
You might be able to find a local business or organization to sponsor the event to provide food, drinks, and supplies.
Decide on the format
Do you want to do surveying of a local area? Or will you be doing “armchair mapping,” contributing data to the map using aerial imagery or other existing data? If you are going to do outdoors surveying, I recommend meeting at a central indoors location to introduce everyone to OpenStreetMap, splitting up into groups to go off and map nearby, then coming back together at the end to add the collected data to the map. Think ahead of time of possible routes that people can take or certain things that people can map, like addresses, mailboxes, or restaurants. During the event, people are free to map whatever they want, but it’s good to have some ideas ready.
Pick your tools
There are tons of great tools out there to help you contribute to OpenStreetMap. You might want to stick with just one or two to teach at the mapathon so people don’t get overwhelmed by too many options. Or you can provide information about all of them and let people decide what works for them.
- For local mapping:
- For remote mapping:
Pick an editor
JOSM and iD are the main OpenStreetMap editors, with JOSM being the primary desktop application and iD as the main web-based editor. If you are expecting a bunch of new mappers, you should focus on iD, which has a simpler interface and doesn’t require a download. If you are an expert JOSM user, make sure to familiarize yourself with iD ahead of time so that you can help answer questions that come up. If you are planning to teach JOSM, make sure it is downloaded on available computers or advise people to download it on their own laptops ahead of time.
Reach out to other local organizations to see if they want to work together and help with outreach. This can include a local OSM Meetup group, Maptime chapter, school or university, or the general open-source community. Leverage other organizations’ email lists and social media presence. You can also organize a mapathon in coordination with a nationwide U.S. mapathon to get extra coverage.
Get out the word
Twitter is great, but there are a bunch of other ways to get out the word, too. You can put up flyers around the neighborhood, send out messages to local neighborhood or city lists, or post on Nextdoor.
Asking for RSVPs
It’s a good idea to ask people to RSVP for the event on Meetup, Eventbrite, or Facebook. People can still show up at the door, but it will help with planning if you have an idea whether you’ll have 5 or 55 people.
The day of
Intro to OSM and the tools
Give a brief introduction to OSM. Some people prefer to do this as a 10-30 minute presentation at the beginning of the event, while others prefer giving 3-5 minute presentations throughout the event on different facets of OSM. You can talk about what OpenStreetMap is, different ways to contribute to the project, give examples of ways that it is used, and why open data is important.
Be sure to introduce the tool(s) that you’re going to use during the mapathon. You can mention additional tools that people can use, but focus on just one or two.
Hopefully you’ll have people at your event who have never even heard of OpenStreetMap. That’s great! If a new mapper has a positive experience at a mapathon, that will make them more likely to support it in some way. There are lots of ways to do this:
- Emphasize that you and other experienced mappers are available to help with any questions.
- Match up experienced mappers with new mappers so that they have a dedicated person who they can go to with questions.
- Print “Get Started” guides with basic information about OSM and how to edit.
- Make sure people know that they don’t need to stay for the whole event.
- Point to additional resources that people can turn to after the event is over.
Go out and survey or start making edits. If you are doing outdoors mapping, make sure someone stays behind at the meeting location in case people show up late or one of the participants has mobility issues.
Most importantly: have fun
Be enthusiastic and have a great time. You are the best ambassador for OpenStreetMap and getting people excited about contributing in whatever way they can.
If you were able to get a sponsor, you can do something social after you’re done with the serious stuff.
If you need more help, here are a few additional resources:
Card photo by Harry Wood
TL;DR We need to ship D8. ;)
I was sent this question today from a co-worker:
"We always talk anecdotally about how Drupal adoption slows before a new release and then picks back up. Do we have any data to support that for Drupal 7 or Drupal 6? I’d love to know the impact of Drupal 8 as well – but not sure that’s possible. Any thoughts?"
This is a great question, but since email is where information goes to die ;), I figured I would copy my response into a blog post as well.
Show me the data!
Since D8 has been in development for so long, we don't have enough data on https://www.drupal.org/project/usage/drupal anymore, since it prunes data older than 3 years. :(
This only goes back to June 2008 which is after D6 came out, so it's not ideal, but we can still glean some useful data out of it.
Here is a screenshot of the data from just prior to Drupal 7's release in January 2011:
- In December 2008 there were 77K installs of D6 (compared to 0 in January since it wasn't out yet :)) (77K% increase). This is when D7 was in active development.
- At the end of 2009 there were 203K installs of D6 (163% increase). This was when D7 was in feature freeze.
- At the end of 2010 there were 323K installs of D6 (59% increase). This was when D7 was just about to ship.
- At the end of 2011 there were 292K installs of D6 (9% decrease). This is when D7 had been out for about a year and several key contributed modules were ported.
- D6 usage has been declining ever since, and is currently at about 135K installs.
Here is the data from 2011 to today:
- At the end of 2010 there were 6.5K installs of D7. This is when D7 was just about to be released.
- At the end of 2011 there were 230K installs of D7 (3438% increase). This is when D7 had been out for about a year and several key contributed modules were ported, and D8 was just beginning development (was mostly D7 bug fixes at this point). Of note, D7 usage eclipsed D6 usage just a few months later (Feb 2012).
- At the end of 2012 there were 522K D7 installs (127% increase). This is when D8 was nearly done with feature development.
- At the end of 2013 there were 728K D7 installs (39% increase). This is after D8 entered code freeze.
- At the end of 2014 there were 869K (19% increase). This is when D8 was in beta.
- As of last week (mid-2015) there were 984K installs (13% increase). D8 is currently still in beta, with ~25 critical issues remaining before release candidates.
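The percentage figures above are plain year-over-year changes; a quick sketch of the arithmetic, truncating fractions the way the figures above do:

```python
def yoy_change(prev, curr):
    """Year-over-year percentage change, truncated toward zero."""
    return int((curr - prev) / prev * 100)

# D6 installs in thousands, per the list above: 77 (2008), 203 (2009), 323 (2010)
print(yoy_change(77, 203))   # 163
print(yoy_change(203, 323))  # 59
```

The same function reproduces the D7 figures (e.g. 6.5K to 230K gives the 3438% jump in 2011).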
There are a few patterns we can discern from this data:
- There is an enormous uptick in Drupal usage with every new major release (though it's delayed until it reaches a "stable" state, i.e. after enough contributed modules are ported).
- After that initial year or two of exponential growth, it slows down a lot.
- The closer the next version is to release, the slower the current version grows. Generally, this is because people will postpone projects and/or use non-Drupal solutions to avoid incurring a major version upgrade.
- Usage of the older stable version starts to decline after the newer major version reaches the "stable" state.
Why Drupal 8 will make this more betterer
There are a few enormous shifts coming with D8 that should change these patterns significantly:
- Drupal 8 is much more fully-featured out of the box than any of its predecessors, so for many sites there is no need to wait on any contributed modules to begin building. Therefore we reach "stable" state (for sites that can do what they need to with just core) at Day 0, not 6-12 months later.
- A number of key contributed modules that delayed porting of other key contributed modules in D6/D7 (Views, Entity Reference, Date, etc.) were moved into core in D8. So they're available right now—even before release—to build on. And indeed we're seeing other big ecosystem modules (Commerce, Rules, etc.) porting now, while D8 is still in development.
- D8 will end the 3-4 year "big bang" release cycle. Instead, we'll be doing "small bang" releases every 6 months with backwards-compatible feature/API improvements. That means we should hopefully stave off adoption decline much longer, and possibly even sustain the "hyper adoption" rate for much longer.
- We will still eventually have a D9 "big bang" release (3-4 years from now) with backwards compatibility breaks, but only after it's amassed enough awesome functionality that couldn't otherwise be backported to D8. This will provide us with another "epochal" marketing event like the one D8 is giving us today (well, soon) to drive adoption even further.
Sorry, that was probably Way Too Much Information™ but hey, the more you know. ;)
Backups are very important for every application, especially if a lot of data is stored in your database. For a website with few updates, regular backups are less critical: you can simply restore last week's backup, and if there were only one or two updates since then, you can re-apply them manually. But if you run a community site with user-generated content and a lot of input, backup and recovery becomes a much more important, and also more complex, topic. If the last backup is from last night, you have to account for all the updates made in the meantime.
Join Amber Matz as she chats with web accessibility aficionados Mike Gifford, Chris Albrecht, and Helena Zubkow about what web developers and Drupalistas can do to build more accessible web sites. How has web accessibility changed over the years? Who is being left behind? What are some common gotchas? What are some easy ways to get started testing for accessibility? All these questions and more are discussed in today's podcast. Don't forget to check out the links and resources in the show notes for all sorts of useful things mentioned in our discussion.
We recently built several highly usable imagery browsers. Each allows users to get right to the data they want by browsing a map. This interface works well with Landsat imagery, which has a consistent coverage area. For large, complex imagery datasets like OpenAerialMap we created a new type of grid interaction.
Designing for complex map data
Last week we launched the beta version of OpenAerialMap, a tool for finding open satellite and drone imagery. We knew that, even in beta, usability would be critical to the adoption and success of OpenAerialMap.
OpenAerialMap is a site that has to show a lot of data. It features imagery from different providers, with different aspect ratios, captured on different dates. Presenting such data in a meaningful and usable way was challenging, especially considering that individual areas can contain multiple overlapping images.
Enter the grid
Battleship board game. Image by John Morgan.
Drawing inspiration from hexgrid experiments using Turf and the good old game of Battleship, we created a grid that shows the density of imagery in any area. The grid breaks the world up into units that are easy to interact with.
We used an "always square" grid, disregarding the map projection. The result is a beautiful, clean, easy-to-use grid where all interactions with the map are consistent and as visually appealing as possible.
We color grade each grid cell according to the number of images intersecting it. This way we avoid bloating the map with useless information while still providing a sense of density. The grid works as a visual guide: at a quick glance the user can easily grasp where imagery is available and how much.
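As an illustrative sketch (not OpenAerialMap's actual code; cell size and function names are made up here), the color grading boils down to bucketing each image's bounding box into fixed square cells and counting intersections per cell:

```python
# Sketch: count how many image bounding boxes touch each square grid cell.
# The per-cell counts are what would drive a color-graded density display.
import math
from collections import Counter

CELL = 1.0  # grid cell size in degrees; an arbitrary choice for this sketch

def cells_for_bbox(min_x, min_y, max_x, max_y, cell=CELL):
    """Yield (col, row) indices of every grid cell the bounding box touches."""
    for col in range(math.floor(min_x / cell), math.floor(max_x / cell) + 1):
        for row in range(math.floor(min_y / cell), math.floor(max_y / cell) + 1):
            yield (col, row)

def density(bboxes, cell=CELL):
    """Map each grid cell to the number of image bboxes intersecting it."""
    counts = Counter()
    for bbox in bboxes:
        counts.update(cells_for_bbox(*bbox, cell=cell))
    return counts

# Two overlapping images near the origin and one image far away:
images = [(0.2, 0.2, 1.4, 0.8), (0.5, 0.1, 0.9, 0.6), (5.1, 5.1, 5.4, 5.3)]
grid = density(images)
# grid[(0, 0)] == 2, grid[(1, 0)] == 1, grid[(5, 5)] == 1
```

The real site presumably does this against actual imagery footprints, but the principle is the same: one cheap count per cell instead of drawing every footprint.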
The grid in action.
Selecting a cell reveals a panel with a list of imagery. From there, the imagery can be previewed on top of the map, downloaded or, when available, used as the baselayer of another map.
Designing usable map interactions requires real thoughtfulness. But it makes all the difference between a tool that people love and one that is forgotten.
In this series of posts we’re going to dig into some of the fundamentals of Drupal 8 theming. By the time we’re finished we’ll have a solid understanding of how to apply many of the new tools and techniques in our work. We’ll also have a starter theme we’ll be able to use in our future projects.
We’re going to begin by building the bare minimum required to get our theme working. We’ll create the basic file structure as well as a critical configuration file so that Drupal will recognize our theme and let us enable it.
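To give a sense of that critical configuration file: in Drupal 8 a theme is declared in a `*.info.yml` file inside the theme's folder. A minimal sketch (the theme name `mytheme` and description are placeholders; the series itself will walk through the exact file) might look like:

```yaml
# themes/mytheme/mytheme.info.yml — minimal Drupal 8 theme declaration
name: My Theme
type: theme
description: 'A bare-bones starter theme.'
core: 8.x
```

With this file in place, Drupal can discover the theme and it can be enabled from the Appearance page.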
We will be at State of the Map US this weekend at the United Nations. It promises to be a great weekend, packed with interesting people, talks and other events. If you want to catch up come show off your Lego skills at our table, or join us at any of our sessions:
- OpenAerialMap - Birds of a Feather
Saturday, 4pm (tentative)
OpenAerialMap is a platform to access openly-licensed satellite and drone imagery. Discuss the architecture and development roadmap for OpenAerialMap and the Open Imagery Network. Find Nate for more info.
- OSM as a platform - Birds of a Feather
Sunday, 10am (tentative)
What if we could use the software that powers OSM in other collaborative mapping projects? It's harder than you might think. We’ll trade tips on running an OSM infrastructure and identify a roadmap to make it easier. Hit me up on Twitter if you want to know more.
- OSM as a platform
Sunday, 3pm - room CR2
I will talk about OpenRoads, a platform for the Philippine government to manage its road network, built entirely on OSM software.
- How to organize a mapathon
Sunday, 3:30pm - room CR3
Catch Robin’s lightning talk on organizing a mapathon.
- OSM Metadata - Birds of a Feather
To be defined
Interested in using OSM’s changeset metadata to learn about the OSM community? Get in touch with Marc and keep an eye on the BOF whiteboard.