
[Feature] Languages#69

Open
Dirius77 wants to merge 13 commits into TheDenSS14:master from Dirius77:languages-but-for-pring

Conversation

@Dirius77

@Dirius77 Dirius77 commented Mar 6, 2026

About the PR

Added the ability for there to be languages, as well as a lot of mechanisms around them.
A person can either speak or not speak a language, but understanding comes in different levels.
Translation of a language can be based on those levels, though some languages are all or nothing.
Languages can also be related to each other. Ever had that weird situation where a few words from another language seem familiar because of a different one you speak? It's that.
This PR adds a ton of example languages that I don't expect to actually keep, they're just there to demonstrate all the systems.

Why / Balance

Being able (or not) to speak different languages is an important and interesting part of people's characters and sets them apart. I wanted to create a system that was better integrated with everything else in the codebase, while also allowing much greater flexibility and creativity in terms of what kinds of languages can be created, and what they can do.

Technical details

Languages are their own entities. They are stored on a person and a lot of the events that modify chat messages are relayed to the languages they are actively speaking, or to the languages they understand in order to determine how their view of the speech works. Components on the languages, and by extension systems, are responsible for every property of the way a language behaves. This makes it easy to mix and match functionality and behavior for the languages. It also makes it easy to add new functionality to the languages, as a new system can simply add a component, and then do whatever it wants with the events that the language system generates.
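To make the relay pattern above concrete, here is a minimal language-agnostic sketch (the actual implementation is C# on the Robust Toolbox ECS; every class and method name below is hypothetical and invented purely for illustration):

```python
# Sketch of "components on a language entity handle relayed chat events".
# All names are hypothetical; the real system relays engine events instead.
class Language:
    def __init__(self, name, components):
        self.name = name
        self.components = components  # behavior lives entirely on components

    def handle(self, text):
        # Each component gets a chance to transform (or eat) the message.
        for comp in self.components:
            text = comp.on_speak(text)
            if text is None:  # a fully-removed message is not displayed at all
                return None
        return text

class UppercaseAccent:
    """Toy component: a 'shouting' transformation."""
    def on_speak(self, text):
        return text.upper()

class EatEverything:
    """Toy stand-in for something like MinimumFluency eating a message."""
    def on_speak(self, text):
        return None
```

Adding new behavior is then just adding a new component class with an `on_speak`-style handler, with no changes to the language entity itself.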

Random pile go:

  • If a system completely removes the contents of a message going to a listener then the message is not displayed at all.
  • Language components that add to the description of a language in the UI (through ExaminedEvent) need to also listen for the startup event and trigger the language system to mark the language as updated so that the language UI actually knows it needs to get new information. They must also be networked.
  • The radio-related functions have an option for forcing whatever is using them to acquire the default language. Presently, every device that speaks on the radio uses this, because it doesn't make sense to try to talk on the radio without a language.
  • Animal 'accents' are still just accents, and they still replace all text, so giving an animal a language will not suddenly make it able to speak, unless you give it a language that doesn't care about accents. I should do something about Cogni.

Components:

  • Audible influences things like voice triggers and parrots that are 'listening' for a language.
  • ChatChannelWhitelist restricts which channels a language can be used on, like Local or Whisper. Telepathic uses this to stop people from whispering it.
  • ChildLanguage is just an indicator that the language is being added from some parent. The UI uses it.
  • Gestalt causes a language to be transmitted to every valid listening player on the server. This should almost always be combined with MinimumFluency or literally everyone will hear it.
  • GestaltHost isn't used for languages themselves, it's just a marker so Gestalts that require something to function aren't searching literally every entity.
  • LanguageCommunicator is the big one, it stores language entities for a speaker and keeps track of what language they're speaking.
  • LanguageFontSuppression marks an entity as not wanting to see custom language fonts.
  • LineOfSightLanguage makes a language perform a line of sight check in order to be understood.
  • MinimumFluency eats the message if fluency is below a certain amount, preventing it from being seen at all. Gestalts should use this unless you want to pollute everyone's brainspace.
  • RadioTransmittable marks a language as being able to be spoken on the radio.
  • ReplaceSpeakerName replaces the names of all speakers of this language with a string.
  • SpeechTransformable marks a language as having accents and other transformations applied to it.
  • SyllableScrambling is the default filter languages go to, replacing words with different syllables and handling fluency and the ability to speak some words.
  • TranslatedLanguage is a marker component for the UI to indicate a language is coming from a translator.
  • UnconsciousLanguage makes it possible to speak a language while asleep. I think I forgot to show this off or use it.
  • UniversalLanguage doesn't go on a language, it goes on a speaker, and it forces them to have a language that makes them treat all other languages as perfectly fluent. This doesn't remove ALL language effects, as ones that don't care about fluency still apply.
  • VisualName causes the language to use a person's visual identity (like the way emotes work) instead of their usual one.
  • WhisperMuffle causes things between 2 and 5 tiles to be muffled. It can also be set to just eat the message entirely.

Scrambling and Understanding:
The syllable scrambler has a dictionary list of the 1000 most common words in the English language. These words have their translations cached, and are also the ones most likely to be translated if someone has partial understanding of a language. The system caches individual words so that they remain consistent between speech, but also caches entire messages; this is important so that the message doesn't need to be re-scrambled for every listener. Listeners with different fluencies will still cause re-scrambling, but the worst case is five or six runs, and it's really not even that slow. The larger the caches are allowed to be (there's a CVar for this), the longer word translations will remain the same before they get bumped out of the cache. The caches are also ordered, so words that are used frequently will not get bumped out and will retain their translation, possibly through an entire round.
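The caching behavior described above can be sketched as an ordered (LRU-style) word cache plus fluency-gated translation. This is a hypothetical Python illustration only; the real code is C#, the syllable list and size are invented, and the common-word list stands in for the real 1000-word dictionary:

```python
import random

SYLLABLES = ["ka", "zo", "mi", "ren", "tul", "vash"]   # invented example set
COMMON_WORDS = {"the", "of", "and", "to", "a"}          # stand-in for the 1000-word list

class WordCache:
    """Ordered cache: frequently used words get bumped to most-recent and
    survive eviction, so their scrambled form stays stable, possibly all round."""
    def __init__(self, max_size=256):
        self.max_size = max_size
        self.entries = {}   # word -> scrambled form
        self.order = []     # least recently used first

    def scramble(self, word, rng):
        if word in self.entries:
            self.order.remove(word)       # bump to most-recently-used
            self.order.append(word)
            return self.entries[word]
        scrambled = "".join(rng.choice(SYLLABLES)
                            for _ in range(max(1, len(word) // 3)))
        if len(self.order) >= self.max_size:
            evicted = self.order.pop(0)   # least recently used falls out
            del self.entries[evicted]
        self.entries[word] = scrambled
        self.order.append(word)
        return scrambled

def translate(message, fluency, cache, rng):
    """Partially fluent listeners understand the common words first."""
    out = []
    for word in message.split():
        if fluency >= 1.0 or (word in COMMON_WORDS and fluency >= 0.5):
            out.append(word)
        else:
            out.append(cache.scramble(word, rng))
    return " ".join(out)
```

Because `scramble` is a cache lookup on repeat words, the same word keeps the same translation between messages, and a fully scrambled message only needs to be built once per distinct fluency level.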

ComplexChatMessage:
This data type is passed all over the place and stores the difference between dialog and emotes for use in detailed messages. Anything that cares about the difference between words that might be 'spoken' (or at least communicated in a language) and emotes should make use of this. It's a slightly annoying struct, but there are functions in SharedChatSystem for interacting with it, and all of the Language systems expect to ingest text in this form.
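For illustration, a data type like this might look as follows. The real ComplexChatMessage is a C# struct in the PR; the field names and shape here are guesses made purely to show the dialog/emote split:

```python
from dataclasses import dataclass
from enum import Enum, auto

class SegmentKind(Enum):
    DIALOG = auto()   # spoken (or otherwise language-borne) words
    EMOTE = auto()    # action text that language scrambling should skip

@dataclass
class Segment:
    kind: SegmentKind
    text: str

@dataclass
class ComplexChatMessage:
    """Hypothetical sketch: a message as an ordered list of typed segments."""
    segments: list

    def dialog_text(self):
        # Only the spoken parts should be fed to language translation.
        return " ".join(s.text for s in self.segments
                        if s.kind is SegmentKind.DIALOG)
```

The point of the structure is that a language system can transform only the DIALOG segments while leaving EMOTE segments untouched.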

Media

smallerplease_000.mp4
smallerplease_001.mp4
smallerplease_002.mp4
smallerplease_003.mp4
smallerplease_004.mp4

Requirements

Breaking changes

Every single function that interacted with chat before has been marked Obsolete and replaced with a Language variant. I have fixed all the current cases where this occurs in the codebase. If you don't care about languages at all, it should be as simple as replacing whatever function or event you were using before with the Language version of it. If you do care, then you're welcome to use the extra fields to adjust how that system behaves. For a good example in the current codebase, look at ParrotMemoryComponent and its related systems.

All of the obsolete functions and types are marked to throw build errors if used, IMO this is a better option than allowing PRs and changes that rely on old chat code to just exist and silently slide into the codebase.

Ahem:

  • EntitySpokeEvent -> EntitySpokeLanguageEvent
  • HeadsetRadioReceiveRelayEvent -> HeadsetRadioReceiveLanguageRelayEvent
  • ListenAttemptEvent -> ListenLanguageAttemptEvent
  • ListenEvent -> ListenLanguageEvent
  • RadioReceiveEvent -> RadioReceiveLanguageEvent
  • SpeakAttemptEvent -> SpeakLanguageAttemptEvent
  • SurveillanceCameraSpeechSendEvent -> SurveillanceCameraSpeechLanguageSendEvent
  • TryVocalizeEvent -> TryVocalizeLanguageEvent
  • TelephoneMessageReceivedEvent -> TelephoneMessageReceivedLanguageEvent

UPSTREAM CHAT REWORK ISN'T REAL AND IT CAN'T HURT ME
(actually if upstream ever does a good rework making this system use it should be really easy because I just need to mark the new events as language relayed and then have the languages work on those instead of their own.)

Changelog
🆑

  • add: Added Languages!

@github-actions
Contributor

github-actions bot commented Mar 6, 2026

RSI Diff Bot; head commit ce5df3b merging into 406b324
This PR makes changes to 1 or more RSIs. Here is a summary of all changes:

Resources/Textures/_DEN/Objects/Devices/translator.rsi

State Old New Status
icon Added
translator Added

Edit: diff updated after ce5df3b

@sleepyyapril
Contributor

test fails

@github-actions github-actions bot added the size/M label Mar 6, 2026
@Dirius77
Author

Dirius77 commented Mar 6, 2026

Tests should all be passing now. Though I did discover a Heisentest on my client in the process. LogWindowTest failing is not my fault (if it does).

@glassgardens

a concern i've heard a lot about fonts is accessibility insofar as SS14 is a very visually overstimulating game, & the varied fonts add a lot more noise into the mix. imo there should be an option to disable all language fonts as well as just the ones the player understands

@Dirius77
Author

Dirius77 commented Mar 6, 2026

I could see the argument for that, though I do think the problem is a bit exacerbated in the older codebase by the selection of fonts and sizes making messages really disrupt the flow of chat. I think if more care is put into making sure that each language occupies the same amount of space in the chatbox, that will be less of an issue. Languages you don't understand are essentially just noise as well. That being said, I guess it is pretty easy to implement and there's no real harm in more options.

@Dirius77
Author

Dirius77 commented Mar 6, 2026

Language Font hiding is now a dropdown option:
image

@glassgardens

Yay!

@Dirius77
Author

Dirius77 commented Mar 6, 2026

Ideally we just tailor the language fonts in such a way that people don't feel it's necessary to turn them off, but having the option never hurts.

@Dirius77 Dirius77 mentioned this pull request Mar 6, 2026
@honeyed-lemons
Contributor

i'd also appreciate the ability to turn off custom language colors clientside
I'm not a fan of the different colors they can have PERSONALLY and more options is good :)

@msTheowo

msTheowo commented Mar 7, 2026

I just quickly tested things and entities without any languages are completely unable to talk. Mice can't "Fwiep!" and silicons can't talk at all. Also, languages should probably be stored in the mind instead of the body.

@portfiend
Contributor

agree with "languages should be mind instead of body", for example if we readd mindswapping

some cases id like to make sure are accounted for and working correctly:

  • medibots saying "hold still"
  • players transformed by an artifact and other polymorph cases
  • accent clothing (french beret etc)

ill try to think of more later

@Dirius77
Author

Dirius77 commented Mar 7, 2026

Entities that can't speak just need some sort of language added to them. That, or the CVar set that forces them to be given one when they fail to speak.

Medibots and other bot and borgs need a language added to them, I can do that.

Languages following the mind is a different thing, and I'll need to poke at the places that move minds around. I am not sure if I can/want to attach it directly to the mind, since it leads to different logic for anything that has a mind vs. doesn't speaking. It might be easiest to just make sure that the mind transfer code brings the languages with it if the transformation in question is one where that makes sense.

@portfiend
Contributor

that makes sense to me!

@honeyed-lemons
Contributor

honeyed-lemons commented Mar 7, 2026

Could have a separate component for holding languages on entities without minds, that then gets temporarily added onto a mind when it enters the body, like, InnateLanguageComponent or smth?
This will save you a lot of headache with stuff like borging, mind transfer, polymorph, etc

@portfiend
Contributor

if i were to also suggest a minor QOL feature: visual distinction between words that are/aren't understood by the player? for example: "Aa of it will be tangoo to bang on tingee other eeooee." could be a color or italics, or maybe applying the font to the untranslated words?

@sleepyyapril
Contributor

if i were to also suggest a minor QOL feature: visual distinction between words that are/aren't understood by the player? for example: "Aa of it will be tangoo to bang on tingee other eeooee." could be a color or italics, or maybe applying the font to the untranslated words?

i would wait on my future wizden PR for this part

@sleepyyapril
Contributor

i'm planning on changing the accent setting and such into an event call

@Dirius77
Author

Dirius77 commented Mar 7, 2026

The languages code is pretty divorced from the chat code at this point, unless you're killing TransformSpeechEvent it likely won't impact it too much. I don't know if the accent changes would impact that. The more difficult part of that would be finding an effect that can apply that generically works with all the languages. Sign is entirely italic for example anyway. I guess I could just make it a defined part of the SyllableScramblerComponent so that it can be set per-language.

@Dirius77
Author

Dirius77 commented Mar 7, 2026

Languages following the mind I think is a much more complex discussion that needs to be looked at, for example, things like Borg chassis inherently knowing Binary, or Xenomorphs being able to communicate with their hivemind (if we keep that, literally all of the languages in this PR are just examples). If you get pushed into the body of a xeno, or borged, you should acquire those if you don't have them, and you shouldn't retain them if you leave. Other gestalts that I've seen people asking for, like the ones for diona or shadowkin, are more tied to the species than the mind, and I don't think that those should follow you between bodies...

Actually I guess the solution is simply to add a component to the language that defines whether it's attached to the body or the mind and have the mind swapping code respect that.

@Dirius77
Author

Dirius77 commented Mar 8, 2026

image image
2026-03-07.23-09-54.mp4
2026-03-07.23-34-37.mp4

There's mice, bots, and borgs. I just gave animals a language that doesn't do anything special beyond being a generic audible language, then they rely on their accents to make them not understandable.

@Dirius77
Author

Dirius77 commented Mar 8, 2026

Languages can now be configured to follow the user through a polymorph; this can be done on a per-language basis with a component.

2026-03-08.00-29-59.mp4

@portfiend
Contributor

if i can pitch something else: could you add a cvar to enable/disable the language system entirely? i.e., if you disabled languages, it should look like languages do not exist entirely to the user - no UI for it, language-related chat events shouldn't trigger, etc

@Dirius77
Author

Dirius77 commented Mar 9, 2026

if i can pitch something else: could you add a cvar to enable/disable the language system entirely? i.e., if you disabled languages, it should look like languages do not exist entirely to the user - no UI for it, language-related chat events shouldn't trigger, etc

I'm assuming you mean this from a server-side perspective, where it's just completely turned off for all connected clients as well.

I can look into doing this, though doing it in a way that also preserves the other chat features that are currently rather integrated into languages (namely the detailed speech thing) would be a decent amount of work.

The other loss would be having to remove [Obsolete(true)] from the legacy chat code as an indicator that it should be switched over to the new system, but that's not too big of a deal.

@portfiend
Contributor

i suppose you could probably accomplish the same thing by having all chat messages route through the default / "basic" language if disabled and simply making sure the UI doesn't appear to the client? if it's a lot of work im not gonna make you do it dont worry. its just something that came to mind because certain other servers with their own downstreams may be also interested in the feature

@Dirius77
Author

I think a toggle where the system just hides existing (forces every entity to use a language that is invisible and understood by everyone and then turns off the UI) would definitely be the easiest to implement, even if it doesn't feel that 'clean'. When I get some time I'll take a look through and decide which I like best for it.

@Dirius77 Dirius77 force-pushed the languages-but-for-pring branch from a987bcf to 36c8390 Compare March 10, 2026 23:03
@Dirius77
Author

There is now a CVar that "turns off" languages. Keep in mind it doesn't actually disable languages completely, and it doesn't remove language-related events, because functionality like accents, name overrides on voice changers, and typing indicators rely on the language systems now. What it does is cause every entity that tries to speak to acquire and use a Default language that has no translation functionality, behaves like basic speech, and is understood by everyone (technically, things with no languages don't understand it, but the language has no obfuscation defined, so the completely-not-understood version of the language is the same as the understood version). Either way, with languages disabled things should function the same as not having them, barring a few things:

  • Disabling Languages DOES NOT disable detailed speech, such as the ability to emote and speak with the emotes avoiding accents, and the ability to emote over the radio.
  • Messages may be formatted slightly differently because of the complex speech system related to the above. (This should be relatively insignificant)

@Dirius77
Author

Right, here's a video of the above:

2026-03-10.18-57-19.mp4

@Dirius77
Author

And here's a CVar for removing the detailed speech formatting if you want it.

@Dirius77 Dirius77 force-pushed the languages-but-for-pring branch from fefb486 to 6f80f7d Compare March 11, 2026 23:28
