Conversation
|
test fails |
|
Tests should all be passing now, though I did discover a Heisentest on my client in the process. If LogWindowTest fails, that's not my fault. |
|
a concern i've heard a lot about fonts is accessibility insofar as SS14 is a very visually overstimulating game, & the varied fonts add a lot more noise into the mix. imo there should be an option to disable all language fonts as well as just the ones the player understands |
|
I could see the argument for that, though I do think the problem is a bit exacerbated in the older codebase by the selection of fonts and sizes making messages really disrupt the flow of chat. I think if more care is put into making sure that each language occupies the same amount of space in the chatbox, that will be less of an issue. Languages you don't understand are essentially just noise as well. That being said, I guess it is pretty easy to implement and there's no real harm in more options. |
|
Yay! |
|
Ideally we just tailor the language fonts in such a way that people don't feel it's necessary to turn them off, but having the option never hurts. |
|
i'd also appreciate the ability to turn off custom language colors clientside |
|
I just quickly tested things and entities without any languages are completely unable to talk. Mice can't "Fwiep!" and silicons can't talk at all. Also, languages should probably be stored in the mind instead of the body. |
|
agree with "languages should be mind instead of body". for example, if we readd mindswapping, there are some cases i'd like to make sure are accounted for and working correctly:
i'll try to think of more later |
|
Entities that can't speak just need some sort of language added to them. That, or the CVar set that forces them to be given one when they fail to speak. Medibots and other bot and borgs need a language added to them, I can do that. Languages following the mind is a different thing, and I'll need to poke at the places that move minds around. I am not sure if I can/want to attach it directly to the mind, since it leads to different logic for anything that has a mind vs. doesn't speaking. It might be easiest to just make sure that the mind transfer code brings the languages with it if the transformation in question is one where that makes sense. |
|
that makes sense to me! |
|
Could have a separate component for holding languages on entities without minds, that then gets temporarily added onto a mind when it enters the body, like, InnateLanguageComponent or smth? |
|
if i were to also suggest a minor QOL feature: visual distinction between words that are/aren't understood by the player? for example: "Aa of it will be tangoo to bang on tingee other eeooee." could be a color or italics, or maybe applying the font to the untranslated words? |
i would wait on my future wizden PR for this part |
|
i'm planning on changing the accent setting and such into an event call |
|
The languages code is pretty divorced from the chat code at this point; unless you're killing TransformSpeechEvent it likely won't impact it too much. I don't know if the accent changes would impact that. The more difficult part of that would be finding an effect to apply that generically works with all the languages. Sign, for example, is entirely italic anyway. I guess I could just make it a defined part of the SyllableScramblerComponent so that it can be set per-language. |
|
Languages following the mind I think is a much more complex discussion that needs to be looked at, for example, things like Borg chassis inherently knowing Binary, or Xenomorphs being able to communicate with their hivemind (if we keep that, literally all of the languages in this PR are just examples). If you get pushed into the body of a xeno, or borged, you should acquire those if you don't have them, and you shouldn't retain them if you leave. Other gestalts that I've seen people asking for, like the ones for diona or shadowkin, are more tied to the species than the mind, and I don't think that those should follow you between bodies... Actually I guess the solution is simply to add a component to the language that defines whether it's attached to the body or the mind and have the mind swapping code respect that. |
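The "component decides body vs. mind" idea above can be sketched roughly like this (a minimal Python sketch of the C# design; `Language`, `Body`, `transfer_mind`, and the `follows_mind` flag are all hypothetical names for illustration, not the PR's actual API):

```python
from dataclasses import dataclass, field

@dataclass
class Language:
    name: str
    # Hypothetical flag: True for languages carried by the person (mind),
    # False for body-bound ones like a borg chassis knowing Binary.
    follows_mind: bool = False

@dataclass
class Body:
    innate: list                                  # languages tied to the body itself
    learned: list = field(default_factory=list)   # languages brought along by a mind

def transfer_mind(source: Body, target: Body) -> None:
    """Move only the mind-bound languages when a mind swaps bodies."""
    moving = [lang for lang in source.learned if lang.follows_mind]
    source.learned = [lang for lang in source.learned if not lang.follows_mind]
    target.learned.extend(moving)
```

Under this sketch, getting borged would leave your hivemind language behind while Binary is already sitting on the chassis as an innate language.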
|
Languages can now be configured to follow the user through a polymorph; this can be done on a per-language basis with a component. 2026-03-08.00-29-59.mp4 |
|
if i can pitch something else: could you add a cvar to enable/disable the language system entirely? i.e., if you disabled languages, it should look like languages do not exist entirely to the user - no UI for it, language-related chat events shouldn't trigger, etc |
I'm assuming you mean this from a server-side perspective, where it's just completely turned off for all the connected clients as well. I can look into doing this, though doing it in a way that also preserves the other chat features that are currently rather integrated into languages (namely the detailed speech thing) would be a decent amount of work. The other loss would be having to remove |
|
i suppose you could probably accomplish the same thing by having all chat messages route through the default / "basic" language if disabled and simply making sure the UI doesn't appear to the client? if it's a lot of work im not gonna make you do it dont worry. its just something that came to mind because certain other servers with their own downstreams may be also interested in the feature |
|
I think a toggle where the system just hides its existence (forces every entity to use a language that is invisible and understood by everyone, and then turns off the UI) would definitely be the easiest to implement, even if it doesn't feel that 'clean'. When I get some time I'll take a look through and decide which I like best for it. |
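That "hide the system" toggle could look something like this (Python sketch; the `UNIVERSAL` fallback and the function names are hypothetical stand-ins, the boolean parameter standing in for the server CVar):

```python
UNIVERSAL = "universal"  # hypothetical invisible fallback language everyone understands

def effective_language(spoken_language: str, languages_enabled: bool) -> str:
    """When the system is disabled, route all speech through the fallback."""
    return spoken_language if languages_enabled else UNIVERSAL

def is_understood(listener_languages: set, language: str, languages_enabled: bool) -> bool:
    """With languages off, everything is understood; the client UI is hidden separately."""
    return not languages_enabled or language in listener_languages
```

The point of the design is that nothing in the chat pipeline needs a special "languages off" path: the fallback language simply behaves as if languages never existed.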
|
There is now a CVar that "turns off" languages. Now, keep in mind it doesn't actually disable languages completely, and it doesn't remove language-related events, because functionality like accents, name overrides on voice changers, and typing indicators rely on the language systems now. What it does do is cause every entity that tries to speak to acquire and use a
|
Right, here's a video of the above: 2026-03-10.18-57-19.mp4 |
|
And here's a CVar for removing the detailed speech formatting if you want it. |




About the PR
Added the ability for there to be languages, as well as a lot of mechanisms around them.
A person can either speak or not speak a language, but understanding comes in different levels.
Translation of a language can be based on those levels, though some languages are all or nothing.
Languages can also be related to each other. Ever had that weird situation where a few words of another language seem familiar because of a different one you speak? It's that.
This PR adds a ton of example languages that I don't expect to actually keep, they're just there to demonstrate all the systems.
Why / Balance
Being able (or not) to speak different languages is an important and interesting part of people's characters and sets them apart. I wanted to work to create a system that was more well integrated with everything else in the codebase, while also allowing much greater flexibility and creativity in terms of what kind of languages can be created, and what they can do.
Technical details
Languages are their own entities. They are stored on a person and a lot of the events that modify chat messages are relayed to the languages they are actively speaking, or to the languages they understand in order to determine how their view of the speech works. Components on the languages, and by extension systems, are responsible for every property of the way a language behaves. This makes it easy to mix and match functionality and behavior for the languages. It also makes it easy to add new functionality to the languages, as a new system can simply add a component, and then do whatever it wants with the events that the language system generates.
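The relay pattern described above can be illustrated roughly like this (Python sketch; the component names and the handler registry are invented for illustration, where the real implementation uses the engine's ECS events):

```python
from dataclasses import dataclass, field

@dataclass
class SpeakEvent:
    message: str

@dataclass
class LanguageEntity:
    """A language modelled as its own entity: behavior comes from its components."""
    components: dict = field(default_factory=dict)  # component name -> component data

def relay_speech(event: SpeakEvent, language: LanguageEntity, systems: dict) -> SpeakEvent:
    """Re-raise a chat event on the language entity; every system that registered
    a component on this language gets a chance to transform the message."""
    for name, data in language.components.items():
        handler = systems.get(name)
        if handler is not None:
            event = handler(data, event)
    return event
```

Adding new behavior is then just adding a component plus a system that handles the relayed event, with no changes to the chat code itself.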
Random pile go:
Components:
- Audible: influences things like voice triggers and parrots that are 'listening' for a language.
- ChatChannelWhitelist: restricts which channels a language can be used on, like Local or Whisper. Telepathic uses this to stop people from whispering it.
- ChildLanguage: just an indicator that the language is being added from some parent. The UI uses it.
- Gestalt: causes a language to be transmitted to every valid listening player on the server. This should almost always be combined with MinimumFluency or literally everyone will hear it.
- GestaltHost: not used for languages themselves; it's just a marker so Gestalts that require something to function aren't searching literally every entity.
- LanguageCommunicator: the big one; it stores language entities for a speaker and keeps track of what language they're speaking.
- LanguageFontSuppression: marks an entity as not wanting to see custom language fonts.
- LineOfSightLanguage: makes a language perform a line-of-sight check in order to be understood.
- MinimumFluency: eats the message if fluency is below a certain amount, preventing it from being seen at all. Gestalts should use this unless you want to pollute everyone's brainspace.
- RadioTransmittable: marks a language as being able to be spoken on the radio.
- ReplaceSpeakerName: replaces the names of all speakers of this language with a string.
- SpeechTransformable: marks a language as having accents and other transformations applied to it.
- SyllableScrambling: the default filter languages go to, replacing words with different syllables and handling fluency and the ability to speak some words.
- TranslatedLanguage: a marker component for the UI to indicate a language is coming from a translator.
- UnconsciousLanguage: makes it possible to speak a language while asleep. I think I forgot to show off or use this.
- UniversalLanguage: this component doesn't go on a language, it goes on a speaker, and it will force them to have a language that makes them treat all other languages as perfectly fluent. This doesn't remove ALL language effects, as ones that don't care about fluency still apply.
- VisualName: causes the language to use a person's visual identity (like the way emotes work) instead of their usual one.
- WhisperMuffle: causes things between 2 and 5 tiles to be muffled. It can also be set to just eat the message entirely.
Scrambling and Understanding:
The syllable scrambler has a dictionary list of the 1000 most common words in the English language. These words will have their translations cached, and are also the ones most likely to be translated if someone has partial understanding of a language. The system caches individual words so that they can remain consistent between speech, but also caches entire messages; this is important so that a message doesn't need to be re-scrambled for every listener. Listeners with different fluencies will still cause scrambling, but worst case that is five or six runs, and it's really not even that slow. The larger the caches are allowed to be (there's a CVar for this), the longer word translations will remain the same before they get bumped out of the cache. The caches are also ordered, so words that are used frequently will not get bumped out and will retain their translation, possibly through an entire round.
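A toy version of the word cache and fluency check (Python sketch; the syllable list, cache limit, and the stable per-word "understands" roll are invented stand-ins for the real CVar-driven implementation):

```python
import random
from collections import OrderedDict

CACHE_LIMIT = 1000  # stand-in for the cache-size CVar

class SyllableScrambler:
    def __init__(self, syllables, seed=0):
        self.syllables = syllables
        self.rng = random.Random(seed)
        # word -> scrambled form, ordered so frequently used words survive eviction
        self.word_cache = OrderedDict()

    def scramble_word(self, word: str) -> str:
        if word in self.word_cache:
            self.word_cache.move_to_end(word)  # refresh: common words stay cached
        else:
            if len(self.word_cache) >= CACHE_LIMIT:
                self.word_cache.popitem(last=False)  # evict least recently used
            self.word_cache[word] = "".join(
                self.rng.choice(self.syllables) for _ in range(max(1, len(word) // 3)))
        return self.word_cache[word]

    def understands(self, word: str, fluency: float) -> bool:
        # Stable per-word roll, so partial fluency translates the same words each time.
        return (sum(map(ord, word)) % 100) < fluency * 100

    def render(self, message: str, fluency: float, common_words: set) -> str:
        """Translate common words the listener knows well enough; scramble the rest."""
        if fluency >= 1.0:
            return message
        return " ".join(
            word if word in common_words and self.understands(word, fluency)
            else self.scramble_word(word)
            for word in message.split())
```

Because the per-word roll is derived from the word itself rather than being random per message, a half-fluent listener sees the same subset of words translated across the whole round, which matches the caching behavior described above.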
ComplexChatMessage:
This data type is passed all over the place and stores the difference between dialog and emotes for use in detailed messages. Anything that cares about the difference between words that might be 'spoken', or at least communicated in a language, and emotes should make use of this. It's a slightly annoying struct, but there are functions in SharedChatSystem for interacting with it, and all of the Language systems expect to ingest text in this form.
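For intuition, the shape of the data is roughly this (Python sketch; the segment and kind names are illustrative, not the actual struct's fields):

```python
from dataclasses import dataclass
from enum import Enum, auto

class SegmentKind(Enum):
    SPOKEN = auto()  # goes through the speaker's language
    EMOTE = auto()   # narration; bypasses language translation

@dataclass
class Segment:
    kind: SegmentKind
    text: str

@dataclass
class ComplexChatMessage:
    segments: list

    def spoken_text(self) -> str:
        """Only the spoken parts should be fed into the language systems."""
        return " ".join(s.text for s in self.segments if s.kind == SegmentKind.SPOKEN)
```

A detailed message like `waves and says, "hello"` then keeps the emote narration readable for everyone while only "hello" gets scrambled for non-speakers.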
Media
smallerplease_000.mp4
smallerplease_001.mp4
smallerplease_002.mp4
smallerplease_003.mp4
smallerplease_004.mp4
Requirements
Breaking changes
Every single function that interacted with chat before has been marked Obsolete and replaced with a Language variant. I have fixed all the current cases where this occurs in the codebase. If you don't care about languages at all, it should be as simple as replacing whatever function or event you were using before with the Language version of it. If you do care, then you're welcome to use the extra fields to adjust how that system behaves. A good example in the current codebase, if you want to look, is ParrotMemoryComponent and its related systems.
All of the obsolete functions and types are marked to throw build errors if used. IMO this is a better option than allowing PRs and changes that rely on old chat code to silently slide into the codebase.
Ahem:
EntitySpokeEvent -> EntitySpokeLanguageEvent
HeadsetRadioReceiveRelayEvent -> HeadsetRadioReceiveLanguageRelayEvent
ListenAttemptEvent -> ListenLanguageAttemptEvent
ListenEvent -> ListenLanguageEvent
RadioReceiveEvent -> RadioReceiveLanguageEvent
SpeakAttemptEvent -> SpeakLanguageAttemptEvent
SurveillanceCameraSpeechSendEvent -> SurveillanceCameraSpeechLanguageSendEvent
TryVocalizeEvent -> TryVocalizeLanguageEvent
TelephoneMessageReceivedEvent -> TelephoneMessageReceivedLanguageEvent
UPSTREAM CHAT REWORK ISN'T REAL AND IT CAN'T HURT ME
(actually if upstream ever does a good rework making this system use it should be really easy because I just need to mark the new events as language relayed and then have the languages work on those instead of their own.)
Changelog
🆑