
Selections: What We Learned

On November 1, 2015, at 11:59pm Eastern, we closed the application period for our first class of Congressional Innovation Fellows.  During our October recruitment sprint, 213 technologists applied for the program. 

Knowing that our fellows would likely be quitting jobs and moving to Washington, we set a goal of choosing and sending offers to our first class of fellows by close of business on Friday, November 20th.  In the days between November 1 and November 20, we conducted three rounds of selections, including one round of interviews with finalists and a series of reference checks.  

It was a harried couple of weeks.  But we worked effectively and efficiently, and we had the benefit of stellar tech tools and an incredibly nimble and diligent Selections Board.  Here’s what we learned about selecting tech talent for Congress during those very busy 19 days.

Applicants come in at the very end.  As of Friday, October 30, only 50 people had applied for the program.  By Sunday morning, we had just over 100 applicants.  That number doubled by the end of the day.  

We’d been worried that three and a half weeks wasn’t enough time to complete a rigorous application (which included three essays).  But our experience, I think, underscores Parkinson’s law-- work expands to fill the time available.  As long as you give people enough time to get their work done, and set a clear, firm deadline, they will get it done.

Screendoor is a lifesaver.  I can’t say enough about Screendoor.  It was a lifesaver and a gamechanger, and there is absolutely no way we would have been able to pull off such a rigorous selections process in such a short period without it.  Among the most helpful features were:

  • An incredibly simple interface that was friendly and accessible to even the least technically-savvy among us.  
  • Easy automated messaging which allowed me to segment and update applicants about the status of their application after each round.  
  • Easy batching and tagging so that we could group candidates at each level of selections.
  • Tiered permission levels and viewing for different Selections Board members.
  • An option for blinding or removing certain fields of the application from reviewers.  We used this to remove the name of the applicant during essay review in order to minimize any unconscious bias, which research has shown can have a significant effect on application scoring.  

There are lots of other great features, but the best way to understand the platform is to have a look yourself.   

Use Calendly to schedule interviews.  We interviewed 10 candidates for our third and final round of selections, which occurred on Wednesday, November 18 and Thursday, November 19.  But our Round Two Selections Board delivered the final candidate list to us just five days prior, on Friday, November 13th, leaving very little room for error or for scheduling delays.  

Rather than trying to navigate the logic puzzle and logistical hurdles of scheduling each interview individually, we used Calendly’s calendar integration and asked the candidates to self-schedule.  I imported my Google Calendar into Calendly and shared the Calendly calendar with our final ten candidates, who had 16 interview slots to choose from (one-hour increments from 9:00am to 4:00pm each day) on a first-come, first-served basis.  Within 30 minutes of sending out the calendar, seven candidates had scheduled their interviews, and by the next day, all ten had picked an interview time.
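The first-come, first-served logic that Calendly handled for us can be sketched in a few lines of Python (a hypothetical illustration of the booking behavior, not Calendly’s actual API):

```python
class SlotScheduler:
    """First-come, first-served interview slot booking (hypothetical sketch)."""

    def __init__(self, slots):
        # Each open slot maps to the candidate who claims it (None = open).
        self.bookings = {slot: None for slot in slots}

    def book(self, candidate, slot):
        """Claim a slot; returns False if the slot is unknown or already taken."""
        if slot not in self.bookings or self.bookings[slot] is not None:
            return False
        self.bookings[slot] = candidate
        return True

    def open_slots(self):
        """Slots nobody has claimed yet."""
        return [s for s, who in self.bookings.items() if who is None]


# Two interview days, one-hour slots from 9:00am to 4:00pm (16 total).
slots = [f"Nov {day} {hour}:00" for day in (18, 19) for hour in range(9, 17)]
scheduler = SlotScheduler(slots)
```

Whoever books a slot first keeps it; everyone else sees one fewer option, which is exactly why seven of ten candidates raced to schedule within half an hour.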

Blinded fields are great, but how do you blind a resume name?  As I mentioned earlier, we integrated blinded fields into the essay review portion of selections, but applicants submitted their resumes as PDFs.  How can you control for unconscious bias when you need to review a resume in PDF form?  The alternative-- requiring applicants to fill out the contents of the resume in individual fields, which can then be individually blinded-- is not ideal.  I wouldn’t want to subject our applicants to that extra work.  I don’t yet know the answer, but we need to find a solution before we select our next class.  If you have ideas, please drop me a line.
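One direction we could explore (a rough sketch, not a solution we’ve adopted): extract the text from each PDF, then mask the applicant’s name-- which we already collect in a separate application field-- before reviewers see it.  The masking step itself is simple:

```python
import re


def blind_resume_text(text, applicant_name):
    """Mask every occurrence of the applicant's name (first, last, or full),
    case-insensitively, before reviewers see the resume text."""
    for part in applicant_name.split():
        text = re.sub(re.escape(part), "[REDACTED]", text, flags=re.IGNORECASE)
    return text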

Create a structured, standardized interview template.  We went into each interview knowing well ahead of time what we were going to ask, because we built an interview template.  We only had 45 minutes with each candidate, and we wanted to be as efficient yet comprehensive as possible: understand each candidate’s skillset, and minimize the interview-to-interview variability that feeds interviewers’ unconscious bias.  With a standardized set of questions, we got through everything we needed to ask efficiently and could easily compare candidates’ answers.
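The structure behind that template amounts to a fixed question list and one scorecard per candidate (a sketch with placeholder questions-- not our actual interview script):

```python
# Hypothetical placeholder questions -- not our actual interview script.
QUESTIONS = (
    "Tell us about a technical project you led.",
    "Why do you want to work in Congress?",
)


def new_scorecard(candidate):
    """One scorecard per candidate, with a score slot for each scripted question."""
    return {"candidate": candidate, "scores": {q: None for q in QUESTIONS}}


def record(card, question, score):
    """Only scripted questions may be scored, which keeps interviews comparable."""
    if question not in card["scores"]:
        raise KeyError("off-script question: " + question)
    card["scores"][question] = score


def compare(cards, question):
    """Line up every candidate's score on the same question."""
    return {c["candidate"]: c["scores"][question] for c in cards}
```

Because every interviewer scores the same questions, comparing candidates is a lookup rather than a judgment call about which notes are even comparable.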

Feedback about application language and essays is important.  We knew that the language we used for recruitment, and the questions we asked on the application, would have a big impact on who applied.  We put together our outreach materials and the application and shared them with two colleagues familiar with hiring for diversity and inclusion: LaurenEllen McCann, a Civic Innovation Fellow at New America, and Aliya Rahman, the Founder of Code for Progress.  Based on their feedback, we made some significant changes to the application in order to broaden the appeal of the program.  Some of the edits included:

  • Changing the essay format by splitting one longer essay into two and emphasizing how applicants could highlight their strengths
  • Modifying many of the required qualifications such as length of work experience and type of experience
  • Adding our commitment to building a diverse and cross-sector technology policy ecosystem

We didn’t (and still don’t) know how to keep all these great applicants engaged.  We had 213 technologists apply for the program.  They ranged from an Iranian programmer and democracy activist to a cybersecurity expert at the State Department to a venture capitalist at the end of his career looking to give back.  Many of them had incredible talent and experience that could be hugely helpful to Congress.  But we were only able to choose two Congressional Innovation Fellows. 

Our application wasn’t easy.  People didn’t apply on a whim.  They applied because they believed they had a unique skillset and perspective, and they wanted to serve.  How do you keep these 211 folks involved and not allow that energy to go to waste?  The short answer is that we didn’t know.

And we still don’t know.  At the end of the day, TechCongress has one employee: me.  And we have to focus on our Minimum Viable Product-- our first Congressional Innovation Fellows-- so that we can build, measure, learn, and grow from where we are now.  But I know that there is a huge opportunity in the amazing talent that applied for our program but that we weren’t able to select for our first cohort.  And we need to figure out a better way to harness it.