The Future of Culture Assessments

Background

Building a positive workplace culture has become one of the main priorities for organizations around the world. Many employers are shifting their focus toward building workplace environments in which their employees can feel safe, engaged, inspired, and productive.

Experts and industry leaders believe that workplace culture is what separates the most successful companies from the average ones. A positive workplace culture increases employee retention, builds engagement, and improves the bottom line.

With all the positive outcomes good culture brings, organizations are looking for effective ways to understand their own culture. Many organizations focus on “work areas” or “climate surveys,” which can be useful in some cases but do not identify root causes or provide a comprehensive understanding of what is happening in the organization; they only capture the current state. A culture assessment is needed to truly understand what drives culture and where improvements can be made.

Since 2014, TIVC has provided Human Enterprise Optimization (HEO) services to federal, state, and local government customers, including Organizational Optimization, IDEA (Inclusion, Diversity, Equity, & Accessibility), Training & Development, and Strategic Communications. We specialize in assessing organizational cultures through surveys, focus groups, and policy reviews, breaking the information down into qualitative and quantitative data to glean insight into how people perceive and feel about their environment. TIVC has leveraged our subject matter experts and industrial psychologists to lead the development of our AI – Natural Language Processing tool, helping ensure that it does not develop inherent bias and that the feedback it provides is actionable.

Existing Tool & How It Works

This tool is designed to provide the user with a comprehensive assessment of an organization’s workplace based on employee feedback from Focus Groups, Interviews, and Surveys. This is done by:

  • Finding topics in the organization’s workplace policies;
  • Computing similarities between the policies and the employee responses from Focus Groups, Interviews, and Surveys (a generic similarity sketch follows this list);
  • Analyzing the sentiment, emotion, and tonality present in the employee responses;
  • Generating plots for the analysis report.
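
As an illustration of the similarity step, the sketch below shows one common way to score how closely employee responses match policy text, using TF-IDF vectors and cosine similarity in Python. This is a generic, hedged example; the example sentences, variable names, and the choice of scikit-learn are ours and are not taken from the tool itself.

    # Illustrative sketch only: score similarity between policy text and employee
    # responses with TF-IDF + cosine similarity (scikit-learn). Not the tool's exact method.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    policies = [
        "Harassment of any kind is prohibited and will be investigated promptly.",
        "Employees may request flexible work schedules with supervisor approval.",
    ]
    responses = [
        "When I reported harassment, nothing was investigated.",
        "My supervisor never approves flexible schedules.",
    ]

    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform(policies + responses)

    # Rows = responses, columns = policies; higher values indicate a closer topical match.
    similarity = cosine_similarity(matrix[len(policies):], matrix[:len(policies)])
    print(similarity)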

Methodology

Data Processing

The data is processed based on the models it is fed into (identified in the overview below). All text data is cleaned using the standard steps of breaking the data into words or tokens, converting them to lowercase, removing stop-words, discarding tokens that are not alphanumeric, and finally reducing the tokens to their word stems.
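
A minimal sketch of this cleaning pipeline in Python is shown below, assuming the NLTK library for tokenization, stop-word removal, and stemming; the function name and specific choices are illustrative, not the tool’s actual implementation.

    # Illustrative text-cleaning sketch using NLTK (requires the 'punkt' and
    # 'stopwords' data packages, e.g. installed via nltk.download).
    from nltk.corpus import stopwords
    from nltk.stem import PorterStemmer
    from nltk.tokenize import word_tokenize

    _stop_words = set(stopwords.words("english"))
    _stemmer = PorterStemmer()

    def clean_text(text: str) -> list[str]:
        tokens = word_tokenize(text.lower())                   # break into lowercase tokens
        tokens = [t for t in tokens if t.isalnum()]            # discard non-alphanumeric tokens
        tokens = [t for t in tokens if t not in _stop_words]   # remove stop-words
        return [_stemmer.stem(t) for t in tokens]              # reduce tokens to word stems

    # Example: the token "running" is reduced to the stem "run".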

Note that responses from Focus Groups and Interviews are processed in a similar way; Surveys are processed differently. The app takes into account only the Survey questions whose responses are degrees of agreement or disagreement (Strongly disagree, Disagree, Somewhat disagree, Neither agree nor disagree, Somewhat agree, Agree, Strongly agree). It analyzes each Survey question’s sentiment and then combines it with the proportion of responses that agree or disagree with that question to produce a sentiment score.
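
One way to combine a question’s sentiment with the share of agreeing and disagreeing responses is sketched below in Python. The Likert weighting and the multiplication used here are assumptions for illustration only, not TIVC’s actual scoring formula.

    # Illustrative sketch: combine a survey question's sentiment with the response
    # distribution. The weights and formula below are assumed, not the tool's own.
    LIKERT_WEIGHTS = {
        "Strongly disagree": -1.0, "Disagree": -2 / 3, "Somewhat disagree": -1 / 3,
        "Neither agree nor disagree": 0.0,
        "Somewhat agree": 1 / 3, "Agree": 2 / 3, "Strongly agree": 1.0,
    }

    def survey_sentiment_score(question_sentiment: float, responses: dict[str, int]) -> float:
        """question_sentiment: polarity of the question text, in [-1, 1].
        responses: number of answers received for each Likert option."""
        total = sum(responses.values())
        if total == 0:
            return 0.0
        # Net agreement in [-1, 1]: +1 if everyone strongly agrees, -1 if everyone strongly disagrees.
        net_agreement = sum(LIKERT_WEIGHTS[opt] * n for opt, n in responses.items()) / total
        # Agreeing with a negatively worded question signals negative sentiment, and vice versa.
        return question_sentiment * net_agreement

    # Example: a negatively worded question that most respondents agree with yields a negative score.
    print(survey_sentiment_score(-0.8, {"Agree": 30, "Strongly agree": 10, "Disagree": 5}))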


Overview of Models

The tool employs three models: a Latent Dirichlet Allocation (LDA) model for topic modeling, a Sentiment Analysis model for sentiment detection in employee responses, and an Emotion Analysis model for emotion detection in employee responses.

  • LDA Model: An LDA model is created and tuned on the Policy data. The model produces clusters of topics found in the Policy data. For instance, if there are 3 policies talking about harassment, the model groups them into one topic. Next, the responses from Focus Groups, Interviews, and Surveys are mapped to the policy clusters obtained above. This is done to analyze which response refers to which policy (see the first sketch following this list).
  • Sentiment Analysis model: Data is extracted from the 3 types of response files (Focus Groups, Interviews, and Surveys) and fed into the model. Three labels are predicted: negative, neutral, and positive, which are used to generate visualizations (see the second sketch following this list).
  • Emotion Analysis model: Data is extracted from the 2 types of response files (Focus Groups and Interviews) and fed into the model. Six labels are predicted: sadness, joy, love, anger, fear, and surprise, which are used to generate visualizations.
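
First sketch: a minimal example of the topic-modeling and mapping step in Python, assuming the gensim library. The number of topics, training passes, and function names are illustrative choices, not the tuned values used in the production tool.

    # Illustrative sketch: fit an LDA topic model on policy documents (gensim), then map
    # each cleaned employee response to its most probable policy topic.
    from gensim import corpora, models

    def fit_policy_lda(policy_tokens, num_topics=10):
        """policy_tokens: each policy document as a list of cleaned tokens (see clean_text above)."""
        dictionary = corpora.Dictionary(policy_tokens)
        corpus = [dictionary.doc2bow(doc) for doc in policy_tokens]
        lda = models.LdaModel(corpus, id2word=dictionary, num_topics=num_topics,
                              passes=10, random_state=42)
        return lda, dictionary

    def dominant_topic(lda, dictionary, response_tokens):
        """Return the policy topic with the highest probability for one response."""
        bow = dictionary.doc2bow(response_tokens)
        topic_probs = lda.get_document_topics(bow, minimum_probability=0.0)
        return max(topic_probs, key=lambda pair: pair[1])[0]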

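Second sketch: the sentiment and emotion steps using off-the-shelf Hugging Face pipelines in Python. The pretrained checkpoints named below are publicly available stand-ins chosen for illustration; they are not necessarily the models used in TIVC’s tool.

    # Illustrative sketch: predict sentiment (negative/neutral/positive) and emotion
    # (sadness, joy, love, anger, fear, surprise) labels for employee responses.
    from transformers import pipeline

    sentiment_model = pipeline("sentiment-analysis",
                               model="cardiffnlp/twitter-roberta-base-sentiment-latest")
    emotion_model = pipeline("text-classification",
                             model="bhadresh-savani/distilbert-base-uncased-emotion")

    responses = [
        "I feel supported by my supervisor.",
        "Nobody follows up on the issues we raise.",
    ]

    for text in responses:
        print(sentiment_model(text)[0])  # e.g. {'label': 'positive', 'score': ...}
        print(emotion_model(text)[0])    # e.g. {'label': 'joy', 'score': ...}
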
Practical Application of Technology and Human Skills

TIVC’s Organizational Optimization pillar focuses on people-centric diagnostics and solutions. Our primary function is to assess culture, which we define as the shared beliefs, attitudes, behaviors, and values of an organization. TIVC’s unique culture assessment approach uses mixed-method research to test for alignment between organizational policies/procedures and employee experience. After completing our assessment, TIVC develops a comprehensive organizational culture and climate profile report. This report can be used to optimize culture by helping to inform and improve decision-making processes and strengthen workplace synergy, trust, and collaboration.

Culture is multi-faceted and therefore cannot be accurately assessed from a single perspective, lens, or skill set. With this in mind, TIVC has built a unique tool for assessing organizational culture and climate that leverages both the power of artificial intelligence and the nuanced perspectives of human subject matter experts, including highly trained and credentialed authorities on culture, climate, diversity/equity/inclusion, data analysis, and change management. While other organizational culture assessments on the market rely on either qualitative or quantitative data, giving an incomplete picture of organizational culture, our approach triangulates data from qualitative and quantitative collection methodologies with data from your organization’s policies, offering a 360-degree view of how written policies align with their practical application.

Our process begins with an assessment of employee sentiments and emotions using AI and Machine Learning models. TIVC’s algorithm determines word frequencies and groups them for further analysis. Our AI tool’s sentiment and emotion detection capability contextualizes topics and themes. The tool also analyzes policies to identify similarities and dissimilarities between stated policies and themes from employee responses.

After the AI tool creates an initial report, TIVC’s subject matter experts analyze the data to identify root causes and make recommendations to optimize organizational culture. Our unique approach leverages technology and human expertise to render an objective and data-driven profile report.

Conclusion

Our unique AI – Natural Language Processing tool allows us to identify organizational patterns, trends, collective beliefs, and perceptions, and to articulate how they impact culture. Human expertise ensures the data is contextualized. This provides our customers with an unbiased snapshot of their existing cultural state and the ability to implement data-driven solutions.

 


I worked as a public affairs officer for the Department of Defense (DoD) for over ten years. The goal of public affairs is to educate and inform the American people, build and maintain a good reputation for the agency, influence public policy, and develop good relations with stakeholders. I worked with thousands of military and civilian leaders and personnel over the years, and sadly most of them were unaware of the importance of public affairs to the command’s mission effectiveness as well as to their own work and even job security.

 

Public affairs officers (PAOs) or specialists have many roles to fill: journalist, editor, event coordinator, script writer, media trainer, outreach coordinator, crisis communicator, graphic designer, and much more. They must be creative in their efforts, as strategic communications is constantly evolving. But creative attempts within the government are often constrained because employees are so often trained on what not to say rather than on what they can say and why they should be talking more.

 

“Too often, subject matter experts (SMEs) or even leadership do not want to engage with PAOs as it is presumed that we are direct representatives of the news media rather than of the command. We do not pass everything we know to media, most especially sensitive or classified information. Our job is to fully understand the command’s mission and tasks and communicate that information to our various audience—primarily the American public,” said retired active duty Navy PAO and current Navy civilian Public Affairs Specialist Captain Joseph F. Gradisher, USN (Ret).

 

The U.S. government employs 1.8M civilians, with over 676K working for the DoD, plus an additional 1.4M active duty members. Each person contributes to the growth, development, and protection of our great nation in their individual role. And while most of these jobs are clerical or administrative, many of them support unique missions, including conserving wildlife, promoting peace through the Foreign Service, manufacturing and distributing coin, providing weather forecasting, preserving heritage with the Smithsonian Institution, and so much more.

 

What you know about each of these organizations can be directly attributed to the efforts of each unit’s Public Affairs/Communications Office. What you don’t know about government operations can be attributed to one of two reasons. The classification of a project could be above secret clearance; most people tend to lean toward this dramatic explanation. However, from my experience as a government communications specialist, it’s more probable that Public Affairs Peter couldn’t get Scientist Susan to return his emails or phone calls about her really cool geology project, or that the pictures from the field were formatted incorrectly and pixelated, leading to an unmarketable product.

 

“No one knows the work better than our SMEs. The PAO cannot be expected to know the details of what everyone in a command is focused on.  Rather, it is our job to talk to the SMEs, and help find ways to translate their technical jargon into plain English so the audience can better understand the issue. It is the partnership of the SMEs and PAOs that will most benefit the command and the audience,” said Gradisher.

 

Getting people to communicate with the communicators is a constant struggle. We often receive pushback because personnel feel it’s a waste of time or even against the rules to be interviewed or write about their special project. They often have the mentality of “I’m doing a good job at the work I signed on for, why do I have to do this ‘extra’ work?” or “I have a secret clearance so that means I’m not supposed to talk about any of my work.”

 

In reality, it could actually benefit you and your colleagues to share more with your public affairs office. Public affairs specialists act as unofficial lobbyists on behalf of their organizations. They sometimes do this by working with government officials directly on issues of public concern. But most often they do it indirectly by communicating the great work of the organization through different mediums to maintain a good reputation and ensure long-term support and success.

 

For example, when the creation of U.S. Africa Command was announced in February 2007, it “faced intense scrutiny and criticism throughout its early days.” Stakeholders included the DoD, USAID, African regional and national governments, U.S. Congress and the armed services staffing and funding the command. Some key concerns the communications team faced included distrust of Western powers given the U.S.’s colonial past with Europe and the invasion of Iraq, lack of consultation with African leaders before making the announcement and underestimating the overall change in overseas military programs, among other challenges. The communications team worked wonders through building messaging based on the importance of African partnership, leadership speeches and talking points, keeping everyone informed with internal meetings and external communication mediums, involving stakeholders in the construction process and many other tactics. “By September 2007, they had developed a very strong rapport… Many were Africanists who strongly believed in the idea of the command and wanted to see the DoD put more priority on African issues.” These efforts resulted in a formal establishment ceremony held in October 2008, and attended by U.S. ambassadors to Africa, German officials, State Department officials and USAID. (Galvin, 2019)

 

Public affairs is part of your organization and can often be directly linked to public acknowledgement of your overall success. Public affairs staff should not be treated as undercover spies trying to extract secret information or as outcasts attempting to slow your work. They have a job to do, and that is to advocate for your position and the organization you work for. So, take some time out of your busy schedule, offer them a chair and an introspective look at your work, and allow them to make magic with their little notebooks and pencils and exceptionally creative campaigns and messaging. The results will make all the difference for your organization.

 

TIVC’s mission is to help people work better together, and we are a proven leader in Human Enterprise Optimization recognizing that people are an organization’s greatest assets. TIVC was founded by Jean Payne in 2014. It is a CVE-certified Service-Disabled Veteran-Owned Small Business headquartered in Charles Town, W.Va. We have current and former contracts with government and commercial customers across the nation. Contact us today at marketing@tiverbatim.com for all your strategic communications needs.

 

References

Galvin, T. P. (2019). Two Case Studies of Successful Strategic Communications Campaigns. U.S. Army War College Press.

Clients often use the terms study, culture assessment, and evaluation interchangeably. It’s true that there is overlap between some or all of them, but there are important distinctions that should be highlighted to communicate clearly, avoid confusion, and manage expectations.

 

 

Research Studies

 

To begin, the goal of a research study is to create new, generalizable knowledge that is reproducible and applicable elsewhere in similar contexts. Studies are typically conducted within the realm of higher education or specialized fields. Some studies involve the scientific method, namely testing a hypothesis by manipulating variables (an intervention), while others focus on developing a theory supported by scientific evidence.

For a simple example, suppose a research team at a university wants to identify the top five indicators of employee burnout in the tech industry. The research team would have to collect data from numerous tech companies and use a sample that is large enough and representative of the general population of the area they are targeting (city, county, state, region, or country). This enables their findings to be applicable to, and reproducible within, other companies in the tech field (not included in their sample), with similar findings likely.

Research studies are meant to be peer reviewed by other academics and then published. This type of review holds the study, particularly its methodology, to a very high degree of scrutiny. In this context, reliability, validity, and statistical significance are very important. If the study involves human subjects, even for low-risk activities like interviews or observations, there are strict rules that must be adhered to, usually requiring Institutional Review Board (IRB) approval. This is different from an assessment or evaluation.

 

 

Culture Assessments

 

The goal of a culture assessment is to gather information from one particular group, organization, or company to better understand the collective opinions, beliefs, views, feelings, and behaviors regarding the norms, rules, regulations, policies, and processes of the group. This aggregated information is used to identify gaps, better understand workforce perceptions, and ultimately identify ways to optimize culture.

Culture assessments and research studies are both learning processes, but they are not held to the same academic level of scrutiny and they serve distinct purposes. In a culture assessment, the focus is not only on the statistical significance of the findings but also on their practical significance.

Another important consideration is that culture assessments, even when they follow the exact same methodology, rarely yield the same results from one organization to the next, even within the same industry. This is due to variations such as demographic distribution, geographic location, and differing customs. The findings from a culture assessment are designed to identify areas in need of course correction and to gather additional information for optimizing culture. Culture assessments are diagnostic in nature and can lead to further formulation or refinement of research study questions, or to areas of improvement for later evaluation.

 

 

Evaluations

 

In addition, some organizations conduct evaluations. Evaluations are judgmental in nature. They involve the comparison of data against a standard for the purpose of determining the value, utility or extent to which objectives are met. Evaluations are summative conclusions of a product, project or program based on evidence collected. Whereas cultural assessments are diagnostic, evaluations are prescriptive. Evaluations will determine how well, or if, certain criteria meet standards and what improvements need to be made. Evaluations are done so that the person or organization being evaluated can understand how they measure up.

In our tech example, an evaluation would compare the percentage of employees who are burned out at one company against other companies in the same industry. This gives the evaluated company a metric of whether it is performing better, worse, or about the same as others. Evaluations can also be used to formulate or refine research study questions and to conduct further assessments.

 

 

TIVC provides expertise in conducting culture assessments, which aim to determine what’s working, what’s not working, and what’s missing in an organizational environment. The tools used to collect data for research studies, assessments, and evaluations can overlap considerably, but it’s important to remember that the purpose for which the data is collected is the primary distinction.

At TIVC, surveys, focus groups and interviews are some of the strategies used to collect insights into the inner nature of an organization. These methodologies provide a window into the attitudes and behavior of an organization, which can then be evaluated against existing organizational policies to determine if there is alignment between policies and practice.

TIVC’s mission is to help people work better together, and we are a proven leader in Human Enterprise Optimization recognizing that people are an organization’s greatest assets. TIVC was founded by Jean Payne in 2014. It is a CVE-certified Service-Disabled Veteran-Owned Small Business headquartered in Charles Town, W.Va. We have current and former contracts with government and commercial customers across the nation.

 

References

Huitt, W., Hummel, J., & Kaeck, D. (2001). Assessment, measurement, evaluation, and research. Educational Psychology Interactive. Valdosta, GA: Valdosta State University. Retrieved from http://www.edpsycinteractive.org/topics/intro/sciknow.html

Levy, J. (2017). How to Differentiate Assessment, Evaluation, and Research. Presence: A Modern Campus Company. Retrieved from https://www.presence.io/blog/how-to-differentiate-assessment-evaluation-research/

McGillin, V. (2003). Research versus assessment: What’s the difference? Academic Advising Today, 26(4). Retrieved from https://nacada.ksu.edu/Resources/Academic-Advising-Today/View-Articles/Research-versus-Assessment-Whats-the-Difference.aspx#:~:text=While%20research%20focuses%20on%20the,or%20decision%2Dmaking%20and%20budgeting

Surbhi, S. (2017). Difference Between Assessment and Evaluation. Key Differences. Retrieved from https://keydifferences.com/difference-between-assessment-and-evaluation.html