
Projects Blog

AI Social Deduction: Week 2.1

6/6/2021


 

Crewmate and Player Movement


Player Spectator Pawn

So far, the Player Pawn can fly around with no collision, much like the Unreal Editor camera. I also added functionality so that the camera only turns while the player holds the right mouse button; the cursor hides and locks in place while that button is pressed. When the right mouse button isn't held, the player is free to use the mouse for clicking in the scene, which will be used later in the project.
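A minimal sketch of how that toggle can be wired up in C++, assuming a pawn derived from ASpectatorPawn (which already gives editor-camera-style flying with no collision). The class name and the "LookToggle"/"Turn"/"LookUp" input mappings are illustrative, not the project's actual code:

```cpp
// PlayerSpectatorPawn.h -- illustrative sketch, not the project's actual class
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/SpectatorPawn.h"
#include "PlayerSpectatorPawn.generated.h"

UCLASS()
class APlayerSpectatorPawn : public ASpectatorPawn // SpectatorPawn already flies with no collision
{
    GENERATED_BODY()

public:
    virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override;

private:
    void BeginLook();
    void EndLook();
    void TurnCamera(float Value);
    void LookUpCamera(float Value);

    bool bLooking = false; // true while the right mouse button is held
};

// PlayerSpectatorPawn.cpp
#include "PlayerSpectatorPawn.h"
#include "Components/InputComponent.h"
#include "GameFramework/PlayerController.h"

void APlayerSpectatorPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // "LookToggle" would be mapped to the right mouse button in the project's input settings.
    PlayerInputComponent->BindAction("LookToggle", IE_Pressed, this, &APlayerSpectatorPawn::BeginLook);
    PlayerInputComponent->BindAction("LookToggle", IE_Released, this, &APlayerSpectatorPawn::EndLook);
    PlayerInputComponent->BindAxis("Turn", this, &APlayerSpectatorPawn::TurnCamera);
    PlayerInputComponent->BindAxis("LookUp", this, &APlayerSpectatorPawn::LookUpCamera);
}

void APlayerSpectatorPawn::BeginLook()
{
    bLooking = true;
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        PC->bShowMouseCursor = false;           // hide the cursor while looking around
        PC->SetInputMode(FInputModeGameOnly()); // lock the cursor to the viewport
    }
}

void APlayerSpectatorPawn::EndLook()
{
    bLooking = false;
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        PC->bShowMouseCursor = true;             // free the cursor for clicking in the scene
        PC->SetInputMode(FInputModeGameAndUI());
    }
}

void APlayerSpectatorPawn::TurnCamera(float Value)   { if (bLooking) { AddControllerYawInput(Value); } }
void APlayerSpectatorPawn::LookUpCamera(float Value) { if (bLooking) { AddControllerPitchInput(Value); } }
```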

I intend for the player to be able to hover the mouse over a crewmate and have them glow, indicating which crewmate will be "Selected" on a left click. Upon clicking a hovered crewmate, the player will be able to see all of the information that crewmate remembers at that moment and which task they are about to attempt next. I initially planned for the player's camera to attach to that crewmate, but I feel that functionality would just take away from the player's ability to easily swap between "Selected" crewmates.

"Camera" Bug

During the creation of the Player Spectator pawn, I ran into a nasty engine bug which took 1.5 hours to resolve. The pawn's initial name was Player Camera, and Unreal apparently does not like the word "Camera" appearing in the name of a created C++ class: when I made a child Blueprint class from the Player Camera C++ class, the engine would instantly unload that class every time the engine restarted. It turns out that putting "Camera" anywhere in the name of a class causes the class to not stay valid in Blueprint form. I'm making note of this as a warning to anyone who reads this, and as an extra detail I learned while creating the Player Spectator class.

Waypoint Location Name

I was able to add the location name to the Waypoint class as an FString. I plan to use this name to help an NPC know which room of the ship it is heading towards. However, I'm also considering letting the tasks themselves indicate to a Crewmate where they should go to perform said tasks, and using the waypoints only for the Crewmates' meandering around the map.
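As a rough illustration, the waypoint class might expose that name like this (the class and property names are just my guesses at the shape, not the actual project code):

```cpp
// Waypoint.h -- illustrative sketch of a waypoint carrying a room/location name
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Waypoint.generated.h"

UCLASS()
class AWaypoint : public AActor
{
    GENERATED_BODY()

public:
    // Editable per placed instance, so each waypoint can be labeled with its room in the editor.
    UPROPERTY(EditInstanceOnly, BlueprintReadOnly, Category = "Waypoint")
    FString LocationName;
};
```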

Main Menu

As shown in the video above, I am able to reset the game to the main menu at any point by pressing my new debug key: the "M" button. This is done by making a call to the ResetGame function in the AIDeductionGameMode. In that function, the positions of the crewmates are reset along with the player spectator, the player spectator's ability to move and turn is shut off, and the crewmates' behavior trees are stopped.
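A rough sketch of what that ResetGame flow could look like in C++. The member names here (Crewmates, StartingTransform, PlayerSpectator, SetMovementEnabled) are assumptions for illustration, not the project's actual API:

```cpp
// AIDeductionGameMode.cpp -- sketch of the reset flow described above (member names assumed)
#include "AIDeductionGameMode.h"
#include "AIController.h"
#include "BrainComponent.h"

void AAIDeductionGameMode::ResetGame()
{
    // Put every crewmate back at its starting transform and stop its behavior tree.
    for (ACrewMateBase* Crewmate : Crewmates)
    {
        Crewmate->SetActorTransform(Crewmate->StartingTransform);

        if (AAIController* AI = Cast<AAIController>(Crewmate->GetController()))
        {
            if (AI->BrainComponent)
            {
                AI->BrainComponent->StopLogic(TEXT("Returning to main menu"));
            }
        }
    }

    // Reset the spectator and disable its movement and turning until the next run starts.
    if (PlayerSpectator)
    {
        PlayerSpectator->SetActorTransform(SpectatorStartTransform);
        PlayerSpectator->SetMovementEnabled(false); // hypothetical helper on the spectator pawn
    }
}
```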

I was, however, unable to make an updated design for the main menu in the allotted time due to trying to correct the "Camera" bug.

What's Next?

This next week, I will begin the implementation of the core elements of the gameplay: the goals for the crewmates to win the game. This will mean that I will be splitting up the crewmates' behaviors between the killers and the innocents. Specifically, I will be working on:
  • Implementing the Tasks, in which a crewmate will be assigned a set number of tasks, and they will then move to said tasks along with their meandering. Once at a task, they will stand by the task zone until their task is considered "Complete". Even though they won't actually be doing anything, a Progress bar will appear above their head indicating how long their task takes.
  • Implementing the Killing, in which killer crewmates will be able to kill off their fellow crewmates. A crewmate who is dead will become a ghost and will not be able to take part in dialogue later on, but will continue ghostly meandering and task performing, like in a normal Among Us game. Killer crewmates should not be able to accidentally kill each other, just the Innocent crewmates.
  • I will also try to smooth the movement of the crewmates a bit more if I can so they don't get stuck momentarily on walls as often.

If I have extra time, I will also implement:
  • The glowing of crewmates when the player hovers over them with the mouse.
  • Buttons on the right side of the screen which will quickly take the player's camera to the corresponding crewmate, for ease of finding crewmates of interest.

Want to Download the Project? 


Click here:
Github Repository

AI Social Deduction: Week 2

6/4/2021


 

Character Movement

In the past week, I was able to create waypoints and the beginnings of a crewmate. To do this, I had to create the following classes:
  • AIDeductionGameMode - Begins by gathering all the waypoints on the map and keeping them in a list. It also has a public function that provides a random waypoint from that list to the CrewmateController whenever a crewmate is meant to meander around the map (a rough sketch of this appears after the list below).
  • Waypoints - placed around the map (at the moment, one in each room) to provide locations for the crewmates to go to. They are indicated in the picture above as the red arrows.
  • CrewMateBase - The base (character) class for crewmates that can be Impostors and Innocents. Holds the perception component and has a Blueprint class made from it to hold the crewmate's physical appearance.
  • CrewMateController - AIController which is used to hold and operate the crewmate's behavior tree.
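Here is a rough sketch of how the waypoint gathering and the random pick might look in C++; the exact member and function names (Waypoints, GetRandomWaypoint) are assumptions, not the project's actual code:

```cpp
// AIDeductionGameMode.cpp -- sketch of waypoint gathering and random selection (names assumed)
#include "AIDeductionGameMode.h"
#include "Waypoint.h"
#include "Kismet/GameplayStatics.h"

void AAIDeductionGameMode::BeginPlay()
{
    Super::BeginPlay();

    // Gather every waypoint placed in the level into the game mode's list.
    TArray<AActor*> FoundActors;
    UGameplayStatics::GetAllActorsOfClass(GetWorld(), AWaypoint::StaticClass(), FoundActors);
    for (AActor* Actor : FoundActors)
    {
        Waypoints.Add(Cast<AWaypoint>(Actor));
    }
}

// Called by the CrewMateController when a crewmate should meander to a random spot.
AWaypoint* AAIDeductionGameMode::GetRandomWaypoint() const
{
    if (Waypoints.Num() == 0)
    {
        return nullptr;
    }
    return Waypoints[FMath::RandRange(0, Waypoints.Num() - 1)];
}
```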
The current behavior tree of the AI. I created and added the "PickRandomWaypoint" node to have the CrewMateController choose a waypoint at random from the AIDeductionGameMode's list and move the crewmate to it. It's quite small now, but I intend to add more soon to provide goal-based movement so that the crewmate can move to a desired task.
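As an illustration, a custom behavior tree task along those lines might look roughly like this. The header would simply declare the ExecuteTask override; the "TargetLocation" blackboard key and the game mode accessor are assumptions:

```cpp
// BTTask_PickRandomWaypoint.cpp -- sketch of a task that writes a random waypoint to the blackboard
// (requires the AIModule and GameplayTasks module dependencies in the project's Build.cs)
#include "BTTask_PickRandomWaypoint.h"
#include "AIDeductionGameMode.h"
#include "Waypoint.h"
#include "BehaviorTree/BlackboardComponent.h"
#include "Kismet/GameplayStatics.h"

EBTNodeResult::Type UBTTask_PickRandomWaypoint::ExecuteTask(UBehaviorTreeComponent& OwnerComp, uint8* NodeMemory)
{
    // Ask the game mode for a random entry from its waypoint list.
    AAIDeductionGameMode* GameMode =
        Cast<AAIDeductionGameMode>(UGameplayStatics::GetGameMode(OwnerComp.GetWorld()));
    AWaypoint* Waypoint = GameMode ? GameMode->GetRandomWaypoint() : nullptr;

    if (!Waypoint)
    {
        return EBTNodeResult::Failed;
    }

    // Write the destination to the blackboard so a following MoveTo node can consume it.
    OwnerComp.GetBlackboardComponent()->SetValueAsVector(TEXT("TargetLocation"), Waypoint->GetActorLocation());
    return EBTNodeResult::Succeeded;
}
```

In a tree like the one pictured, this node would sit in a sequence directly before a MoveTo node that reads the same blackboard key.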
The classes created to begin working on the crewmate implementation and their movements.

Player Camera

I've decided the best way for the user to be able to spectate the simulation is to create a floating camera class. I intend to use this video as inspiration, converting it from Blueprints to C++ code. The user will be able to float around the map and keep an eye on all the crewmates as they roam the ship. I will also be adding a function where the player can click on a crewmate, or press a button while looking directly at one. This function will then attach the player camera to the crewmate, as shown in the YouTube video, and display the information about what that crewmate knows and is doing.

Main Menu

I also created a very basic main menu as a first pass. The main goal of this early main menu iteration is to be able to reset the crewmates and the player camera back to their initial positions, like they will do when a simulation ends. I have also set it up so that the crewmates do not begin performing their actions until the start button has been pressed.

To continue iterating on this menu, I will be working on:
  • Adding a debug button to have the main menu pop back up, replicating a game over.
  • Designing and drawing a more interesting main menu to the best of my abilities.

What's Next?

This week has been busy for me with other school projects, so I fully intend to make more headway on this project over the weekend. I will be doing a "Week 2.1" update post on Sunday to show the progress I've made, and I will discuss Week 3 in that post. This weekend I intend to:
  • Get the Player Camera working with movement and attachment to a Crewmate. Displaying what the crewmate knows will come later.
  • Make waypoints hold data indicating which room they are in, so that crewmates can begin having goal-oriented movement.
  • Get the debug button and a better design for the main menu created and implemented.

Want to Download the Project?

Click here:
Github Repository

AI Social Deduction: Week 1

5/28/2021


 

UE4 Observation System

I will be using Unreal's AI Pawn/Controller system to program the movements and interactions of the NPCs. On top of this, I will be using Unreal's AI Perception component to create the "Sight" of the NPCs, since this project is about recording details of what they "See" rather than simulating the NPCs' vision itself.

The UE4 Sight Perception works by setting the vision range and the peripheral vision angle on the individual AI. I will be giving all NPCs a limited 180-degree field of vision to mimic the vision range implemented in Among Us. The Sight system depends on specific actors being registered as AIPerceptionStimuli sources.

I will set all of the NPCs to be Sight stimuli. I also plan for the NPCs to know which room they are currently in or are entering. When one NPC sees another, the seen NPC will pass its location to the observer, providing the information of which room it is in or entering. This can also be used to pass along whether an NPC is dead or is in the middle of killing another NPC.
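For reference, here is a minimal sketch of what configuring the Sight sense and registering a crewmate as a stimulus can look like in C++. The radii are placeholder numbers, and the class/member names (ACrewMateBase, Perception, StimuliSource, declared as UPROPERTY members in the header) are assumptions rather than the project's tuned values:

```cpp
// CrewMateBase.cpp -- sketch of Sight sense configuration and stimuli registration
#include "CrewMateBase.h"
#include "Perception/AIPerceptionComponent.h"
#include "Perception/AIPerceptionStimuliSourceComponent.h"
#include "Perception/AISenseConfig_Sight.h"
#include "Perception/AISense_Sight.h"

ACrewMateBase::ACrewMateBase()
{
    // Perception: what this crewmate can "see".
    Perception = CreateDefaultSubobject<UAIPerceptionComponent>(TEXT("Perception"));

    UAISenseConfig_Sight* SightConfig = CreateDefaultSubobject<UAISenseConfig_Sight>(TEXT("SightConfig"));
    SightConfig->SightRadius = 1500.f;                // how far the crewmate can see (placeholder)
    SightConfig->LoseSightRadius = 1700.f;            // slightly farther before a seen actor is "lost"
    SightConfig->PeripheralVisionAngleDegrees = 90.f; // half-angle: 90 to each side gives a 180-degree field
    SightConfig->DetectionByAffiliation.bDetectNeutrals = true;
    SightConfig->DetectionByAffiliation.bDetectFriendlies = true;
    SightConfig->DetectionByAffiliation.bDetectEnemies = true;

    Perception->ConfigureSense(*SightConfig);
    Perception->SetDominantSense(SightConfig->GetSenseImplementation());

    // Stimulus: makes this crewmate detectable by other crewmates' Sight sense.
    StimuliSource = CreateDefaultSubobject<UAIPerceptionStimuliSourceComponent>(TEXT("StimuliSource"));
}

void ACrewMateBase::BeginPlay()
{
    Super::BeginPlay();

    // Register with the perception system once the world exists.
    StimuliSource->RegisterForSense(UAISense_Sight::StaticClass());
    StimuliSource->RegisterWithPerceptionSystem();
}
```

Note that PeripheralVisionAngleDegrees is measured from the forward vector to each side, which is why 90 degrees here corresponds to the 180-degree field mentioned above.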

Data to Remember

Over the course of the week, I studied what data I want my NPCs to keep an eye out for and what info to record. The info I want the NPCs to actually "Observe" is as follows (a sketch of how an observation record might be structured follows the lists):

Observational Reasonings (all observations will come with a timestamp):
  • Basic (easier to implement, will be doing so first)
    • Seeing a character in a room
    • Seeing a character go from one room to another
    • Seeing a character kill another character
    • Noting what room the observer enters
    • Seeing a dead body
  • Advanced (Stretch goal)
    • Seeing a character attempt a task
    • Seeing a character come from a room with a dead body, and discerning this means they are the likely killer
 
Dialogue Reasonings:
  • Basic (easier to implement, will be doing so first)
    • Describing when and where the observer saw a character go from one room to another, or saw them in a room
    • Mentioning the last character the observer saw in (or saw enter) the murder room, and when
    • Determining who was the most recent to enter the murder room, or determining that there is no info on who the killer could have been
  • Advanced (Stretch goal)
    • Lying that they were in a different room than they really were
    • Estimating the time of death
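To make the above concrete, here is one hypothetical shape a recorded observation could take. The enum and field names are illustrative, not taken from the project:

```cpp
// Observation.h -- hypothetical data layout for a single timestamped observation
#pragma once

#include "CoreMinimal.h"
#include "Observation.generated.h"

UENUM(BlueprintType)
enum class EObservationType : uint8
{
    SawCharacterInRoom,
    SawCharacterChangeRoom,
    SawKill,
    EnteredRoom,
    FoundDeadBody
};

USTRUCT(BlueprintType)
struct FObservation
{
    GENERATED_BODY()

    UPROPERTY(BlueprintReadOnly)
    EObservationType Type = EObservationType::SawCharacterInRoom;

    UPROPERTY(BlueprintReadOnly)
    FString SubjectName; // who was seen (empty for self-observations such as EnteredRoom)

    UPROPERTY(BlueprintReadOnly)
    FString RoomName; // the room the event was observed in, or the room being entered

    UPROPERTY(BlueprintReadOnly)
    float Timestamp = 0.f; // game time of the observation, e.g. GetWorld()->GetTimeSeconds()
};
```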

First Map Iteration

The first iteration of the map has been completed! While it is a very simple implementation lacking any Level Design flair, this map will suffice for beginning to test NPCs. Below, you can see each room has a designated name. The level was inspired by the Among Us map, but I decided to get rid of the narrow hallways and make the map more evenly structured to allow for easier observation by a spectator.
The original sketch.
The map comes with twelve different rooms. Every room has a point of interest, which is where an NPC will go to perform tasks. The cafeteria's point of interest, the table, will instead be used as the point for the emergency button.
Even though it looks very unappealing now, the map's overall look (not the layout) can be updated piece by piece over the course of the project to make something more pleasing to look at, which is what I intend to do. The layout can be changed if it turns out there are not enough rooms to give the killers a fair chance at being alone with a victim.

Want to Download the Project?

Click here:
Github Repository

AI Social Deduction: Proposal

5/28/2021


Project Summary

For my 10-week Programming 3 Personal Project, I will be creating an Among Us-like social deduction game in Unreal and then creating AI NPCs to play the game and face off against each other. The goal of this project is to take a deep dive into AI observing their surroundings, communicating with each other, remembering what they've learned, and changing their behavior based on what they know.

Weekly Breakdown

  • Week 1: Research
    • Research how AI observe scene elements in Unreal and what details the NPCs should be memorizing.
    • Implement a basic map to begin testing the NPCs in.
  • Week 2: NPCs
    • NPC randomized movement towards goals (try to get them to meander around similar to how a human player would)
    • Randomize the killer amongst the NPCs (may be cut for scope's sake and to make it easier to know who the killers are each round)
    • Main Menu
  • Week 3: NPC Gameplay
    • Continue NPC movement.
    • NPC Killing
    • NPC Tasks
  • Week 4: Victories
    • Task win and Killer win conditions
    • End simulation screen
    • Begin observation implementation.
  • Weeks 5 and 6: NPC Observation
    • Implement Observation and Recording Observations for the NPCs (begin with a simple implementation, then build up more complex details)
    • Work on visually displaying what NPCs learn and know, and provide the user ways to check these details.
  • Weeks 7 and 8: NPC Dialogue
    • Create basic discussion scene.
    • Work on NPCs typing out dialogue between each other in the voting scene and recording observations learned from dialogue.
    • Continue to work on visually displaying what NPCs know.
  • Weeks 9 and 10: Voting and Wrap-up
    • NPC voting scene and voting victory
    • Expanding on AI reasoning if able to
    • Bug fixing

Visualization of Data

To help with debugging and showing observers what is being witnessed and learned in the simulation, these elements will be implemented to provide visualization of data:
  • A text box at the bottom left of the screen to give short updates on events as they happen in the game
  • Buttons to click on to focus on a specific NPC and reveal what they "know".
    • Show a text list of what the focused NPC has seen and at what time stamp
    • Show the radius of vision for the NPC

Risks

  • Mitigated
    • Trying to get the NPCs to play the real Among Us game - Dealt with by deciding to create my own simple game similar to Among Us.
    • Trying to get the NPCs to play against human players - No humans will be needed to play against my NPCs in the game I make. This could be a stretch goal, but that's unlikely given how many commands a human player would need in order to exchange information with the NPCs.
  • Still Possible
    • Game setup takes longer than expected - Ways to resolve this include dropping randomization of the killers, removing the emergency button, using online art assets, and removing the recent events text box.
    • Dialogue between NPCs is difficult to create - To reduce the time spent on this, the dialogue can be made simple and repetitive. It could also always be truthful and always taken as fact by the NPCs. (Ex: Red says Green killed Blue, so all the NPCs instantly know Green is the killer.)

Full Powerpoint

Want to see the full Presentation Powerpoint? Click Here: