Generative AI for Video Games

Generative AI is one of the most promising routes to a long-standing goal in modern video games: Non-Player Characters (NPCs) that imitate particular playing behaviors specified as rules, rewards, or human demonstrations. However, directly applying existing generative AI techniques to video games is challenging. First, generative AI aims to learn a distribution from offline data by iteratively minimizing a proxy measure, e.g., a loss function. This proxy measure, however, may not exactly describe the target, desired behavior the NPC should imitate (a problem known as value misalignment). The resulting model, though possibly optimal with respect to the proxy measure, can still produce unnatural or non-human-like NPC behaviors. Second, modern video games are rich in interaction modalities, task settings and playing roles. The inputs to a generative model can therefore be multi-modal, multi-task and multi-embodiment, which poses a great challenge because most generative models can only capture the distribution of a single modality.
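To make the proxy-measure framing concrete, here is a minimal sketch (assuming PyTorch; names such as NPCPolicy and train_step are hypothetical) of behavior cloning, where an NPC policy is trained by minimizing cross-entropy to offline demonstrations. The cross-entropy loss plays the role of the proxy measure: a policy can drive it very low and still act in ways players perceive as unnatural.

    import torch
    import torch.nn as nn

    class NPCPolicy(nn.Module):
        # A small policy network mapping game observations to action logits.
        def __init__(self, obs_dim: int, n_actions: int, hidden: int = 128):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(obs_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, n_actions),
            )

        def forward(self, obs: torch.Tensor) -> torch.Tensor:
            return self.net(obs)

    def train_step(policy, optimizer, obs, demo_actions):
        # One gradient step on the proxy measure: cross-entropy between the
        # policy's action distribution and the demonstrated actions.
        # Minimizing this proxy does not by itself guarantee that the
        # resulting behavior looks natural to players.
        logits = policy(obs)
        loss = nn.functional.cross_entropy(logits, demo_actions)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()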

Research objectives include: (1) developing human-centric generative models, i.e., models whose generated distribution strikes a balance between the target distribution (optimality) and the distribution a human player would produce (naturalness/human-likeness) [1]. We propose an ad-hoc teaming between generative AI and domain experts that consists of two steps: first, the generative AI learns a model from offline human demonstrations using a set of predefined proxy measures; second, the domain experts iteratively rank the quality of the learned models based on the naturalness of the generated behaviors. Through this learning-correction loop, the generated NPC behavior is guaranteed to be natural. (2) developing generative models with heterogeneous inputs, i.e., models whose generated distribution captures both the underlying distributions and the joint distribution of data from different modalities [2]. We propose a generative AI that accepts both text and visual demonstrations and produces the target text outputs and the desired NPC behavior. We will develop a unified generative model for multi-modal, multi-task and multi-embodiment inputs.
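As one possible (not prescribed) instantiation of the correction step in objective (1), the sketch below assumes that expert rankings are reduced to pairwise preferences and fitted with a Bradley-Terry model; all names are hypothetical. The learned naturalness score could then be combined with the original proxy measure to trade off optimality against human-likeness.

    import torch
    import torch.nn as nn

    class NaturalnessScorer(nn.Module):
        # Maps a fixed-size behavior embedding to a scalar naturalness score.
        def __init__(self, feat_dim: int):
            super().__init__()
            self.score = nn.Linear(feat_dim, 1)

        def forward(self, feats: torch.Tensor) -> torch.Tensor:
            return self.score(feats).squeeze(-1)

    def ranking_loss(scorer, preferred, rejected):
        # Bradley-Terry negative log-likelihood: the behavior the domain
        # expert ranked higher should receive the higher naturalness score.
        margin = scorer(preferred) - scorer(rejected)
        return -nn.functional.logsigmoid(margin).mean()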

Entry requirements

The standard academic entry requirement for this PhD is an upper second-class (2:1) honours degree in a discipline directly relevant to the PhD (or international equivalent), OR any upper second-class (2:1) honours degree and a Master’s degree at merit in a discipline directly relevant to the PhD (or international equivalent).

How to apply

You will need to submit an online application through our website here: https://uom.link/pgr-apply

When you apply, you will be asked to upload the following supporting documents: 

• Final Transcript and certificates of all awarded university level qualifications

• Interim Transcript of any university level qualifications in progress

• CV

• You will be asked to supply contact details for two referees on the application form (please make sure that the contact email you provide is an official university/ work email address as we may need to verify the reference)

• Supporting statement: A one- or two-page statement outlining your motivation to pursue postgraduate research and why you want to undertake postgraduate research at Manchester, any relevant research or work experience, the key findings of your previous research experience, and techniques and skills you’ve developed. (This is mandatory for all applicants, and the application will be put on hold without it.)

• English Language certificate (if applicable). If you require an English qualification to study in the UK, you can apply now and send this in at a later date. 

Before you apply

We strongly advise you to contact the supervisors before you apply.

Funding

At The University of Manchester, we offer a range of scholarships, studentships and awards at university, faculty and department level, to support both UK and overseas postgraduate researchers.

For more information, visit our funding page or search our funding database for specific scholarships, studentships and awards you may be eligible for.

Equality, diversity and inclusion is fundamental to the success of The University of Manchester, and is at the heart of all of our activities. We know that diversity strengthens our research community, leading to enhanced research creativity, productivity and quality, and societal and economic impact. We actively encourage applicants from diverse career paths and backgrounds and from all sections of the community, regardless of age, disability, ethnicity, gender, gender expression, sexual orientation and transgender status.

We also support applications from those returning from a career break or other roles. We will consider flexible study arrangements (including part-time study at 50%, 60% or 80%, depending on the project/funder).
