We're building value and opportunity by investing in cybersecurity, analytics, digital solutions, engineering and science, and consulting. Our culture of innovation empowers employees as creative thinkers, bringing unparalleled value to our clients and to every problem we tackle.
Empower People to Change the World®
Booz Allen Commercial delivers advanced cyber defenses to the Fortune 500 and Global 2000. We are technical practitioners and cyber-focused management consultants with unparalleled experience – we know how cyber-attacks happen and how to defend against them.
Our strategy and technology consultants have empowered our international clients with the knowledge and experience they need to build their own local resources and capabilities.
Our Middle East and North Africa clients face complex modernization challenges that benefit from our proven experience guiding major programs and projects for governments and private-sector organizations. The services we offer in the UAE, Qatar, Egypt, Turkey, Kuwait, Morocco, Jordan, and other countries in the region build on our consulting legacy.
Our clients call upon us to work on their hardest problems—delivering effective health care, protecting warfighters and their families, keeping our national infrastructure secure, and navigating the blurring boundaries between consumer products and manufacturing.
Booz Allen was founded on the notion that we could help companies succeed by bringing them expert, candid advice and an outside perspective on their business. The analysis and perspective generated by that talent can be found in the case studies and thought leadership produced by our people.
Learn more about Booz Allen's diverse culture and environment of inclusion that fosters respect and opportunity for all employees.
Learn how we’re driving empowerment, innovation, and resilience to shape our vision for the future through a focus on environmental, social, and governance (ESG) practices that matter most.
We've come a long way delivering innovative solutions. But our next chapter is still being written.
Our 28,600 engineers, scientists, software developers, technologists, and consultants live to solve problems that matter. We’re proud of the diversity throughout our organization, from our most junior ranks to our board of directors and leadership team.
Booz Allen takes pride in a culture that encourages and rewards the many dimensions of leadership—innovative thinking, active collaboration, and personal service. We’re particularly proud of the diversity of our Leadership Team and Board of Directors, among the most diverse in corporate America today.
This article summarizes “Enhancing Trust in AI Through Industry Self-Governance,” published online in April 2021 in the Journal of the American Medical Informatics Association.
In an article for the Journal of the American Medical Informatics Association (JAMIA), Dr. Joachim Roski, Booz Allen health analytics leader; Dr. Ezekiel Maier, Booz Allen analytics leader; Dr. Kevin Vigilante, Booz Allen chief medical officer; Elizabeth Kane, Booz Allen health operations expert; and Vanderbilt University Medical Center's Dr. Michael Matheny, present insights that organizations of industry stakeholders can use to adopt self-governance as they work to maintain trust in artificial intelligence (AI) and prevent an "AI winter."
Industry stakeholders see AI as critical for extracting insights and value from the ever-increasing amount of health and healthcare data. Organizations can use AI to synthesize information, support clinical decision making, develop interventions, and more—creating high expectations for AI technologies to effectively address any health challenge. However, throughout the history of AI development, periods of enthusiasm have been followed by periods of disillusionment, known as AI winters. During these AI winters, both investment in AI and adoption of AI best practices wane.
“To counter growing mistrust of AI solutions, the AI/health industry could implement similar self-governance processes, including certification/accreditation programs targeting AI developers and implementers. Such programs could promote standards and verify adherence in a way that balances effective AI risk mitigation with the need to continuously foster innovation.”
- “Enhancing Trust in AI Through Industry Self-Governance,” JAMIA, April 2021
Today, publicity around highly touted but underperforming AI solutions has placed the health sector at risk for another AI winter. To respond to this challenge, we propose that industry organizations consider implementing self-governance standards to better mitigate risks and encourage greater trust in AI capabilities.
Building on the National Academy of Medicine’s AI implementation lifecycle, we created a detailed organizational framework that identifies 10 groups of AI risks and 14 groups of mitigation practices across the four lifecycle phases. AI developers, implementers, and other stakeholders can use this analysis to guide collective, voluntary actions to select, establish, and track adherence to trust-enhancing AI standards.
Without industry self-governance, government agencies may act to institute their own compliance requirements. However, industries that have proactively defined, adopted, and implemented standards complementary to government regulation have reduced the urgency of public-sector action while making better use of limited regulatory resources. Industry self-governance also enables exceptional agility in responding to evolving technologies and markets.
Several key success factors should be taken into account when considering self-governance. These include the creation of an industry-sanctioned certification and accreditation program. Success also depends on stakeholder confidence that all standards and methods have been developed in coordination with consumers and patients, clinicians, AI developers, AI users, and other key parties.
While AI advancement continues with government support, there are also signs of a technology backlash, underscoring the need to mitigate AI-related risks. The government-led management of such public risks occurs in various ways; however, targeted, AI-specific legislation does not yet exist. Diverse organizations of health industry stakeholders could step in to help manage AI risks through self-governance. Evidence-based risk mitigation practices could be effective across the industry while promoting and sustaining user trust in AI, fending off the next AI winter.