Insider started from a simple need. Many employers and brands have a hard time sharing their company's authentic culture with jobseekers. Smart marketing and curated company photos can only go so far in telling the story of what it's really like to be an employee at a company. At Indeed we noticed a gap for many companies: their story was being driven by reviews from the subset of employees who elected to leave one, and often those reviews didn't provide an accurate reflection of the true culture.
By following a Lean UX approach, we focused on a greater level of collaboration across the entire team. The core objective was to obtain feedback as early as possible so that it could be used to make quick decisions. We worked in rapid, iterative cycles to ensure the data generated could inform each iteration.
After the project was accepted by Indeed's Incubator, we began with a week-long design sprint to bring the team into alignment on a shared mission and vision. The sprint was very hands-on and incorporated the voices of marketing, design, engineering, management, and senior leadership. Through this process we talked openly as a team about what problem we were setting out to solve and what obstacles we might encounter along the way. Our days were organized to include user interviews, brainstorming, ideation, lean canvassing, wireframing, and user testing.
Each day was filled with collaborative exercises designed to inspire the team and focus it on a single target. On the first day we focused on understanding the problem, listening to potential customers, and choosing a target for the sprint. We followed with a day focused on solutions through sketching and critical thinking. Once we had a large stack of proposed solutions, we spent a day prioritizing the strongest and storyboarding what the process might look like for the customer. From there we built a fully interactive prototype from low-fidelity wireframes for user feedback and testing on the final day of the sprint.
From this work we envisioned, executed, and tested a review collection tool built on the backbone of an engagement survey. We learned that employees are willing to provide authentic reviews when they are primed to provide feedback directly to their employers. The mentality was "what's in it for me?": employees who felt they were truly being heard by their employers were more willing to leave an engaging review.
Once we felt we had a solid MVP, I mapped the journey of both our users and internal teams into a timeline of user actions describing the relationship between our product and customers. This visualization identified all of a user's interactions with our product from their point of view, providing a customer-centric approach to product building that ultimately leads to better customer relationships.
As we started onboarding our first customers, I conducted ongoing interviews and surveys to better define the types of customers we were trying to attract. From this insight I was able to distill our ideal customers into three persona types. The first represented medium-sized businesses whose main goal was to build a base of reviews to publish while gaining insight into their workforce's satisfaction and engagement. The second was built around medium to large businesses that already had reviews posted on Indeed but felt they were inaccurate and outdated. The final persona type was built around small businesses that had no content on their company page and were new to the idea of collecting feedback and reviews from their employees.
Since our product was going to touch both the employer side and the employee side of a business, I also generated two unique persona types around the users who would be providing the content for their employers. The first was formed around the employee who felt open to sharing their experience and was willing to provide an honest interpretation of it. The second was built around the employee who felt hesitant to give their employer feedback out of fear that it wouldn't be anonymous.
When it came time to launch, we offered a bare-bones MVP assembled from third-party tools so we could verify that we could actually deliver our ideal solution to employers. By focusing less on building out a complete product, we were able to focus on the acquisition cycle, which was initially heavily sales-driven. Taking this route let us gain real-time feedback on the hesitations and roadblocks employers faced when deciding to try our beta product.
Once we had a basic product built from the tools available, we started testing our positioning and value propositions to ensure what we were offering resonated with our customers. By focusing on the top of the funnel, we could later build out an optimized product from the signals we received at our first touchpoint.
The earliest step of our funnel begins with a marketing landing page, which went through many iterations to optimize copy, feature callouts, benefits, styling, and imagery. As we drove traffic to the page, we identified winning combinations that led to significantly higher click-through rates.
Being a part of Indeed's ecosystem means we have instant access to potential users and an ideal platform to market within. A critical component of our acquisition includes strategic messaging at various pain points an employer may experience throughout our site. For example, we positioned promos that targeted employers with no or few reviews and marketed our solution as a way to remedy this.
The promo approach relied on users actively visiting Indeed and their company page in order to be effective. But what about employers who were less active on the site? To bring in these users, we built a marketing outreach program to further test our value props while creating a new lead generator for our sales team.
Once our process was up and running and we began to see real growth early in our product's life cycle, we decided the biggest lift for ensuring proper scalability was a self-serve portal, so users could onboard themselves without direct contact from our sales team.
By building the client intake before optimizing the product, we were able to get signals on what features and offerings employers were most interested in. These insights were critical in helping us determine how we would build out our core product in the future. One test included a preview of survey packages to learn which types of surveys employers felt were valuable.
When we started sending surveys to employees, we tested open and completion rates for surveys sent from Indeed versus surveys sent from the employer. Finding the right positioning proved tricky: to reduce the appearance of spam, we incorporated the company's name and branding, while balancing employers' need for the survey to appear unbiased and under the supervision of Indeed as an external party. To do this, we asked employers for company details with which we would brand their surveys and survey outreach.
Hands down the most critical step of our process is getting structured employee data from employers. This is a big lift for many companies, for both privacy and technical reasons. Since we could address the technical restrictions, we tested a flow that made it simple to upload contact details in bulk. This solution included a templated CSV upload, manual entry, and a Gmail integration. It proved to be a big win because it made sharing existing data as simple as possible for employers.
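As a rough illustration of what the templated CSV intake could look like, here is a minimal sketch in Python. The column names, validation rules, and function name are assumptions for illustration, not Insider's actual schema:

```python
import csv
import io
import re

# Hypothetical template columns; the real intake's schema is not specified here.
TEMPLATE_COLUMNS = ["name", "email", "job_title", "location"]
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def parse_contacts(csv_text):
    """Parse an uploaded contact CSV, returning (valid_rows, errors)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = [c for c in TEMPLATE_COLUMNS if c not in (reader.fieldnames or [])]
    if missing:
        return [], ["missing columns: " + ", ".join(missing)]
    valid, errors = [], []
    for lineno, row in enumerate(reader, start=2):  # header is line 1
        email = (row.get("email") or "").strip().lower()
        if not EMAIL_RE.match(email):
            errors.append(f"line {lineno}: invalid email {email!r}")
            continue  # collect the error instead of rejecting the whole file
        cleaned = {c: (row.get(c) or "").strip() for c in TEMPLATE_COLUMNS}
        cleaned["email"] = email
        valid.append(cleaned)
    return valid, errors
```

The key design point is that a bad row produces a targeted error message rather than failing the entire upload, which keeps the flow forgiving for employers.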
In an effort to ensure employees felt comfortable about the legitimacy of our survey while balancing the voice of the product, we automated the introduction of our survey by prompting employers to write an introductory email that goes out the week before the survey launches. Through testing we found this step provided a large lift in email open and survey completion rates, as it informed employees why the survey is important for both them and their company. When employers are transparent, employees feel vested in the process and are therefore more willing to participate.
When sending our initial test surveys, we experimented with delivery timing to determine which days and durations yielded the best results. From this insight we decided to provide a rigid scheduling structure that automates the delivery of introduction emails, survey outreaches, and subsequent reminder emails. By controlling this timing, we can ensure each employer is primed for the best outcomes, and it allows us to test new hypotheses in a controlled environment.
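A rigid schedule like the one described is easy to express in code. This sketch derives every delivery date from a single launch date; the specific offsets and durations are illustrative assumptions, not Insider's actual timing:

```python
from datetime import date, timedelta

def build_schedule(launch_date, duration_days=14, reminder_offsets=(3, 7, 10)):
    """Derive all delivery dates from one launch date.

    The intro email goes out a week before launch (per the process described);
    reminder offsets and survey duration are hypothetical defaults.
    """
    return {
        "intro_email": launch_date - timedelta(weeks=1),
        "survey_launch": launch_date,
        "reminders": [launch_date + timedelta(days=d) for d in reminder_offsets],
        "survey_close": launch_date + timedelta(days=duration_days),
    }
```

Because every date is computed from the launch date, changing one hypothesis (say, a different reminder cadence) changes exactly one parameter, which keeps experiments controlled.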
After our client intake was out in the wild and collecting data, we reconvened to analyze which steps in the funnel were performing better or worse than others. By focusing on each step and pairing it with logged events, we identified that the upload step was producing the largest drop-off. Through user research and a review of the data we had collected from employers to date, we determined that additional error handling and interactivity were necessary to facilitate successful data uploads.
We took the position that no matter what type of data an employer had, and no matter how it was structured, our app should accept it, normalize it, and provide the ability to edit and update it without asking the employer to leave our product to make critical changes. With more robust error handling, employers can identify any incongruities in their data during the self-serve intake and take appropriate action with minimal work.
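One piece of "accept anything, normalize it" is mapping whatever column headers an employer's export happens to use onto a single canonical schema. A minimal sketch, with an alias table that is entirely an illustrative assumption:

```python
# Hypothetical alias table: common header variants mapped to canonical names.
HEADER_ALIASES = {
    "email": {"email", "e-mail", "email address", "work email"},
    "name": {"name", "full name", "employee name"},
    "job_title": {"job title", "title", "role", "position"},
    "location": {"location", "office", "city"},
}

def normalize_headers(raw_headers):
    """Return (raw header -> canonical name mapping, unrecognized headers).

    Unrecognized headers are surfaced to the employer for in-app correction
    rather than silently dropped.
    """
    mapping, unknown = {}, []
    for raw in raw_headers:
        key = raw.strip().lower().replace("_", " ")
        for canonical, aliases in HEADER_ALIASES.items():
            if key in aliases or key == canonical:
                mapping[raw] = canonical
                break
        else:
            unknown.append(raw)
    return mapping, unknown
```

Returning the unrecognized headers alongside the mapping is what enables the in-product edit step: the employer resolves incongruities where they are, instead of re-exporting their data.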
Once our funnel was reasonably optimized and working to our advantage, we placed more focus on our core survey product. The MVP survey was performing well and providing real value to both employers and employees, so we were fortunate to find success early on. Once we had the bandwidth, however, we began testing new hypotheses to enhance the value we offered.
When testing our survey, I noticed some friction around quantifying experiences and opinions. After testing star ratings, a 10-point sliding scale, and a traditional Likert scale, I found that by color-coding answers and displaying text to describe the scale, employees were less spontaneous with their answers, and overall we saw a boost in positive responses. The hypothesis was that seeing a red response prompted the employee to think more deeply about their answer.
Working closely with Indeed's engagement team, we brainstormed questions that provided meaningful insight to the employer and targeted the real experience of employees. This brainstorming led to the release of questions built around a LEAD framework, which categorized responses by Leadership, Enablement, Alignment, and Development.
When we automated the delivery process for employers, employees started questioning the anonymity of our survey. Knowing that their surveys were tied to their email addresses, job titles, and locations, they raised real concerns about how anonymous their responses truly were. To address this, we speak transparently about how we protect employees and the steps we take to obscure data that could identify them.
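One common way products protect respondents, offered here purely as an illustrative sketch rather than Insider's actual mechanism, is minimum-group-size suppression: any reporting segment with too few responses is withheld so an individual's answer can't be singled out. The threshold of 5 is an assumption:

```python
# Illustrative minimum group size; not Insider's actual threshold.
MIN_GROUP_SIZE = 5

def reportable_segments(responses_by_segment, min_size=MIN_GROUP_SIZE):
    """Keep only segments with enough responses to preserve anonymity."""
    return {
        segment: responses
        for segment, responses in responses_by_segment.items()
        if len(responses) >= min_size
    }
```

Under this rule, a department with a single respondent simply never appears in segmented reporting, which is the kind of obscuring step employees can be told about transparently.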
As surveys were collected, we noticed a group of employees who seemed to go above and beyond when providing content for their employers. We called these individuals our super surveyors and began optimizing the survey to collect more content from those willing to give it. At the end of the survey, we provide follow-up questions about perks, benefits, salary, and work-life balance, and drip-feed questions to users we identify as super surveyors. In doing so, we can continue to provide more content and insight for employers and jobseekers alike.
Employee feedback to employers is only valuable when the insight is actionable. When we started displaying metrics that were underperforming, employers wanted to address the underlying issues but felt the responses didn't tell the entire story. To provide the detail they were seeking, we built a feedback mechanism that lets employees expand on their responses by adding comments alongside their survey answers.
Once we dialed in the mechanics of our surveys, we tested the presentation. Early tests included displaying the survey on a single screen and displaying questions one at a time; eventually we found that by breaking the survey into satisfaction survey, public review, and micro-content steps, employees were better able to focus on the task at hand. The winning variant reduces mental friction for employees about which content is for their employer and which content will be published publicly. In doing so, employees can be candid with their feedback, knowing their review will be separate from any criticism they provide to their employers.
Once we started providing reporting on survey results, employers' main feedback was that the data felt very valuable but they weren't sure what to do with it next. For companies that had a large number of responses, combing through the data was intensive, and sharing the results with key stakeholders proved to be more work than they felt equipped to handle. To be driven to publish their results, employers needed to feel satisfied that the data was representative of their company and worth publishing.
Once results started coming in from open surveys, employers began asking for snapshots of their progress. We built an MVP dashboard that provided real-time metrics and ended up becoming our survey summary report. This dashboard highlights the top-level data points from the survey and helps employers gauge the health and success of their survey.
By collecting data-rich responses, we offer the ability to segment responses by location, department, and job title while providing top-level metrics for the LEAD categories our engagement questions are structured around. This highlights specific areas that may be underperforming, giving employers the feedback they need to know what changes may be necessary within their organization.
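The segment-level LEAD rollup described above can be sketched as a simple group-and-average. Field names and the response shape are assumptions for illustration:

```python
from collections import defaultdict

LEAD_CATEGORIES = ("leadership", "enablement", "alignment", "development")

def lead_scores_by_segment(responses, segment_key="department"):
    """Average each LEAD category's scores within each segment.

    `responses` is assumed to be a list of dicts carrying a segment field
    (department, location, or job title) plus a numeric score per category.
    """
    buckets = defaultdict(lambda: defaultdict(list))
    for resp in responses:
        segment = resp[segment_key]
        for category in LEAD_CATEGORIES:
            if category in resp:
                buckets[segment][category].append(resp[category])
    return {
        segment: {cat: sum(vals) / len(vals) for cat, vals in cats.items()}
        for segment, cats in buckets.items()
    }
```

Running the same rollup with `segment_key="location"` or `"job_title"` gives the other cuts, which is what makes an underperforming pocket of the organization visible at a glance.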
In order to close the feedback loop, we surface the qualitative responses left by employees so employers can dig deeper into their unique experiences. From this section, employers also have the ability to message surveyors directly while still protecting participants' anonymity. By keeping communication within our platform, employees and employers can work together to address issues without fear of repercussions.
Although we only allow employers to publish all or none of their reviews, eliminating the ability to cherry-pick, we still provide insight into how their ratings stack up and help identify any inflammatory responses that may need to be flagged for moderation by Indeed. While we seek the authentic voice of employees, we also take steps to ensure sensitive data is not exposed in the process of publishing reviews.
Google's research team noted that while small-scale measurement frameworks were commonplace, there was no framework in place for measuring experience at a large scale via automated means. The HEART framework specifically targets that kind of measurement. Its principles are equally useful at a small scale, though the methodologies used to derive measurements at a smaller scale are likely to be substantially different.
We have the privilege of boasting one of the highest NPS scores within Indeed. Through feedback loops and continuous improvements we have maintained a healthy score while scaling our product.
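For reference, NPS is a standard calculation: the percentage of promoters (ratings of 9-10 on the 0-10 "how likely are you to recommend" question) minus the percentage of detractors (0-6). A minimal sketch:

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 recommendation ratings.

    Promoters rate 9-10, detractors rate 0-6; passives (7-8) only dilute.
    Result ranges from -100 to 100.
    """
    if not ratings:
        return 0.0
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)
```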
In tracking engagement, we focused on survey setup starts, as we felt this was a good metric for gauging interest in our product.
Successful survey setups let us know that our client intake is performing optimally, and as we scale, our goal is to ensure any updates to our process don't negatively impact this metric.
The timescale for Insider is unique, as many employers don't yet have a need to run surveys continuously. As our product's lifespan matures, we hope to see more users running repeat surveys.
Another key metric we are highly proud of is our publish rate. Employers who run an Insider survey are 95% likely to publish their results, meaning we are successfully capturing valuable content and delivering the value we promised.
This project was built upon the work and research of multiple teams spread across Indeed. Through our efforts we have connected with our internal employee engagement team, run internal surveys with departments throughout Indeed to "dog-food" our product, and partnered with the Indeed Company Page team in Tokyo to integrate content publishing so that data is successfully transmitted, moderated, and visible to jobseekers on an employer's company page. The relationships and shared goals we have defined throughout this process have been invaluable, and a true highlight of working within a company as large and diverse as Indeed.
By providing such a valuable service to employers on Indeed, we have empowered them to invest more deeply in our ecosystem as they generate content for their company page.
Indeed Incubator is set up on a metered-funding framework. As we continue to prove our worth, we are awarded continued funding, which has recently surpassed $1 million.
Revenue is always great, but content is just as valuable for Indeed's success. Insider has provided a new stream of content from employees to boost SEO and traffic for our clients.
Once our product has demonstrated its worth, we plan to partner with the existing product team for an eventual acquisition so that this product will continue to scale and expand beyond our beta.