Class of 2026 Cadets Boston Graf, Maksymilian Olszowka and Elizabeth “Ezra” Bardales labored on a capstone project, a philosophy-centric venture in the [Department of Law and Philosophy](https://www.westpoint.edu/academics/departments/law-and-philosophy), focusing on the “Training Ethical Employment of Lethal Autonomous Weapons Systems,” or TEELAWS.
The capstone explored software designed by Perceptronics Solutions Inc., in collaboration with the U.S. Military Academy, to teach current and future leaders to make responsible tactical decisions with sophisticated, novel Lethal Autonomous Weapons Systems.
The training takes a human-centered, or agent-centered, approach: given a specific set of circumstances in which tasks are delegated to the weapons systems, it focuses on how commanders make ethical and responsible decisions when using autonomous weapons.
The software, which incorporates a Bayesian Ethical Decision-Making Model to analyze tactical decisions and provide an ethical score, is currently being tested in PY201 (Philosophy and Ethical Reasoning) at West Point.
The lead of the cadet group, Graf, is a Sacramento, California, native and Philosophy major who has worked on the project since August 2024, nearly two years, also collaborating with last year’s capstone team. It began as a literature review of exactly what it means for commanders to use Lethal Autonomous Weapons Systems responsibly, ethically and legally.
What they hoped to understand were the conditions in which a commander would be capable of exercising appropriate levels of human judgment in the use of force, as required by U.S. policy. Based on their research, the cadets developed a “Theory of Moral Agency” that captures the knowledge, skills, and abilities that a commander needs to have to make effective, ethical, and legal operational decisions. This theory became the basis for the training software that the Cadets developed with Perceptronics Solutions.
“We’ve been creating scenarios within the software for people to think through how to use the Lethal Autonomous Weapons Systems,” Graf stated.
As his cadet group prepares for the [2026 Projects Day Research Symposium](https://www.westpoint.edu/about/academy-events/projects-day-research-symposium/2026) on April 23 at West Point, Graf explained that there are four aspects of the software with each one corresponding to an aspect of moral agency that they defined.
The four aspects of moral agency are understanding, appreciation, choice and reasoning. Within the software used in the PY201 classes, cadets receive feedback that fosters ethical reflection and moral discussion.
“What we’re doing is trying to ensure that commanders are able to be moral agents when using these things,” Graf described. “For example, in terms of understanding, we’re saying that’s the capabilities and limitations of the Lethal Autonomous Weapons Systems. Then, to use it responsibly, you need to understand what the thing is … through different curriculum. Here’s the capabilities, what you need to know, and the relevant information about the scenario.”
Graf spoke about the choice aspect, which users exercise in the software by working through the tactical decision games.
“What the user has to do is think through how they’re using the autonomous weapons systems in conjunction with human teams, plus the given environment, given the enemy – all those different things,” Graf said.
Then as the users go through the reflection and coaching pieces, it moves into the reasoning aspect of moral agency.
“We call them relevant mental models, and it’s just a way of conceptualizing how to use the autonomous weapons systems,” Graf stated.
The Bayesian Ethical Decision-Making Model used in the software represents decision-making mathematically within a diagram.
“It’s a mathematical way of representing how humans reason through making decisions,” Graf said. “You take the relevant information you have and then take the new information and then you divide it by the old information.
“What this model does with the tactical decision games within the software, it’s able to take the different courses of action within each decision, or within each tactical decision game, and it analyzes it using this Bayesian math,” Graf added. “Then it comes out with what we call an ethical score. It’s basically saying how adherent is it to the law of armed conflict – the different principles of law of armed conflict.”
The Bayesian model not only scores each course of action in terms of how confident the model is that an action is compliant with the law but also identifies the situational factors that most influence that score. The latter feature, the sensitivity analysis, drives coaching within the training modules.
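The kind of scoring and sensitivity analysis described above can be illustrated with a toy Bayesian update. This is only a minimal sketch under assumed numbers, not the actual TEELAWS model: the factor names, likelihood values and prior here are all hypothetical, and the score is simply the posterior probability that a course of action is law-compliant given the observed situational factors.

```python
def posterior_compliance(prior, factors, likelihoods):
    """Bayes update: P(compliant | observed factors).

    likelihoods[f] = (P(f | compliant), P(f | non-compliant)).
    """
    p_c, p_n = prior, 1.0 - prior
    for f in factors:
        l_c, l_n = likelihoods[f]
        p_c *= l_c
        p_n *= l_n
    return p_c / (p_c + p_n)

def sensitivity(prior, factors, likelihoods):
    """Toy sensitivity analysis: how much the score shifts
    when each factor is left out of the update."""
    base = posterior_compliance(prior, factors, likelihoods)
    return {
        f: abs(base - posterior_compliance(
            prior, [g for g in factors if g != f], likelihoods))
        for f in factors
    }

# Illustrative numbers only (not from the actual model):
likelihoods = {
    "positive_id": (0.9, 0.3),       # positive ID favors compliance
    "civilians_nearby": (0.2, 0.6),  # nearby civilians cut against it
}
score = posterior_compliance(0.5, ["positive_id"], likelihoods)  # 0.75
```

In this sketch, the factor with the largest sensitivity value is the one that most moved the score, which is the sort of output the article says drives coaching in the training modules.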
“What this does is we created tactical decision games as a way of thinking through moral agency, and then we combine it with the Bayesian model to get the TEELAWS software we have now,” Graf said.
As a future Infantry officer, Graf acknowledged there is a high probability he’ll use Lethal Autonomous Weapons Systems. Reflecting on working through the program and the uncertainty of situations, he said, “What we’ve really learned is the ethics part, how do you use the thing correctly, within many different considerations.”
When using the autonomous weapons systems in the software, Graf explained, “The tension really plays out in what tasks I’m delegating to the autonomous weapons system.” To capture that, the team created a way of thinking called “Tension Tables.”
As the scenarios play out, Graf said, conducting an ambush, for example, is one decision, but there are many subtasks that need to be completed within the larger decision. Within the tension tables, the team takes those subtasks and assigns which agent completes each one.
“Is it the autonomous weapon? Is it human? Is it the commander? This is an effective way of conceptualizing how I am using the Lethal Autonomous Weapons Systems, and what tasks I’m telling it to do,” Graf said. “Those tasks that I’m delegating to it, do they align with the capabilities or limitations of that Lethal Autonomous Weapons System?”
Within the capabilities of the system, ethics then becomes a factor in how to use it responsibly when delegating tasks to that system.
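The tension-table idea can be sketched as a simple data structure: each subtask of a mission is assigned to an agent, and delegations to the autonomous system are checked against its stated capabilities. Everything here, the task names, the agents, the capability set, is a hypothetical illustration of the concept, not the team's actual implementation.

```python
# Illustrative capability set for a Lethal Autonomous Weapons System.
LAWS_CAPABILITIES = {"scan_sector", "track_target"}

# A toy "tension table" for an ambush: subtask -> responsible agent.
ambush_tension_table = {
    "scan_sector": "autonomous_system",
    "positively_identify": "human_operator",
    "authorize_engagement": "commander",
    "track_target": "autonomous_system",
}

def delegation_issues(table, capabilities):
    """Return subtasks delegated to the autonomous system that fall
    outside its stated capabilities, i.e. delegations to question."""
    return [task for task, agent in table.items()
            if agent == "autonomous_system" and task not in capabilities]
```

Here `delegation_issues(ambush_tension_table, LAWS_CAPABILITIES)` is empty, because every task handed to the system is within its capabilities; delegating a task like authorizing engagement to the system would be flagged.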
“There has been much discussion with this project that many times people will take tactics and ethics and put them on opposite ends of the spectrum, when we’re taking the stance that in most cases, the ethical decision is actually the best tactical decision,” Graf elaborated. “A slogan we use in the OP order is, ‘where are the ethics?’ … There are civilian considerations, but there’s no designated part saying, what are the ethical implications of your decision.
“But in reality, ethics comes in the process of thinking through the mission as we go through our TRPs (Target Reference Point) and what our sections of fire are, all those different things when thinking about the decision that we’re making,” he added. “That’s where the ethics comes in when using these autonomous weapons systems. Oftentimes, when you come to the tactical decision, it’s in your best interest that it’s also ethically sound at the same time. It’s about thinking through what you’re doing, what are the implications of what you’re doing, and how will this lead to mission success.”
There are blurred lines encompassing everything that can happen in a real-life environment when telling an autonomous weapons system to call for fire in an area containing enemy forces, civilians and civilian infrastructure.
“It takes a contextual understanding of the environment to comprehend – do you need to call for fire?” Graf explained. “Is it OK to call for fire at the moment, given the context of the situation? It is a very interesting part in the reflection and discussion aspect to get people to start thinking about that, and I think that is what’s powerful in this tool.”
In the future, as more scenarios are added and the software is refined, Graf hopes TEELAWS is something that can be used in every Basic Officer Leadership Course (BOLC).
Graf and his project mates are excited to present at the Projects Day Research Symposium, similarly to how they have presented this software to [U.S. Army Combat Capabilities Development Command](https://devcom.army.mil) (DEVCOM) and IBM in recent months.
The team’s advisor, Maj. Grace Ryan, [Department of Law and Philosophy](https://www.westpoint.edu/academics/departments/law-and-philosophy) instructor, is impressed with the cadets who delved deep into the complexities of the project.
“Throughout the project, the cadets have integrated philosophical methodologies from their coursework into broader frameworks of decision-making and problem solving,” Ryan said. “They have learned to break down challenging concepts in ways that meaningfully inform training software design. In addition, the cadets strengthened their collaborative skills by articulating their ideas clearly, building shared products, and engaging professionally with our collaborators and partners.
“Overall, they have demonstrated a strong ability to connect philosophical theory to military training and AI frameworks,” she added, “and to communicate complex ideas clearly in ways that transcend disciplinary boundaries.”
Ryan expressed the positives of this training software integrated into the TEELAWS curriculum and PY201 sections in giving instructors and cadets a platform to have profound discussions about their ethical reasoning and decisions on how, when or if they ought to use an autonomous weapons system.
“The coaching tool within the software aided in an analysis of ethical and legal considerations that the cadets use when making a decision,” Ryan articulated. “We were able to discuss how those considerations aligned with their mission, with their risk tolerance and, ultimately, whether the task they delegated to the autonomous weapons system was aligned with the capabilities.”
The software helps because it gives a glimpse into current U.S. policy, which dictates that autonomous weapons be designed to allow commanders to exercise appropriate levels of human judgment over the use of force.
“I think it prepares (cadets) to best use their judgment when they are making any decision,” Ryan said. “By helping (cadets) understand what they need to know about a system, their environment, mission and ethical/legal principles, they can appropriately delegate tasks to systems and maintain responsibility for their decisions.”
Ryan is pleased with the way Graf, Olszowka and Bardales have worked and grown over the last year as they prepare to be future leaders of character.
“They have accomplished so much and made a meaningful impact on the project and the research over the last year,” Ryan conveyed. “I think in part due to the reasoning commitment and rigor that they’ve brought to the project. Perhaps more so, I think it is a result of who they are as individuals and how they’ve approached a really difficult problem.”
“It has been such an awesome experience working on this project with them, and I cannot say enough about how proud I am of them and the work they’ve done,” Ryan added.
Olszowka touted how this capstone project, similar to classes at West Point, presented challenging questions while being tasked to tackle a novel problem.
“The capstone has been valuable in thinking through a problem that does not have a found solution,” Olszowka said. “I learned that the process is hard, and often, I wanted to give up. But seeing the work have a genuine impact on cadets (in PY201) and a genuine interest from academia filled me with hope.”
Olszowka felt the capstone project helped him as a future officer in terms of critical thinking and communication.
“It certainly developed my ability to communicate hard problems and our solution to others, which is critical as an officer,” Olszowka concluded. “Furthermore, having a strong baseline understanding in LAWS and their ethical employment will have a direct practical impact on my future.”
The “Training Ethical Employment of Lethal Autonomous Weapons Systems (TEELAWS)” project is part of West Point’s 27th annual [Projects Day Research Symposium](https://www.westpoint.edu/about/academy-events/projects-day-research-symposium/2026), which showcases hundreds of cadet-led research projects. Learn more about select project features and how to partner with West Point at [www.westpoint.edu/werx](http://www.westpoint.edu/werx).
Date Taken: 04.23.2026
Date Posted: 04.23.2026 10:33
Story ID: 563402
Location: WEST POINT, NEW YORK, US
PUBLIC DOMAIN
This work, Cadets work with software using Lethal Autonomous Weapons Systems that analyzes tactical decisions, provides ethical score, by Eric Bartelt, identified by DVIDS, must comply with the restrictions shown on https://www.dvidshub.net/about/copyright.