New Audio API [Beta]: Elevate Sound and Voice in Your Experiences

Can I have a follow-up to this? At least some answer to my question, please :pray:

5 Likes

If you’re running a “Local Server” playtest with 2 or more clients, the clients unfortunately won’t be able to hear one another. Even if you enable the “Active” property of each client’s AudioDeviceInput (locally or on the server), even if the IsReady property replicates as “true” between the clients in the playtest, and even if each client is able to transmit audio simultaneously, that audio isn’t picked up by any of the other clients in the local playtest.

I’m not sure if that’s an unintended issue specific to local playtests, where the Voice Chat UI and the “Input Device” option within the Roblox Settings Menu are not present, whereas they are available in Team Tests and in live servers.


While it’s still possible to use Voice Chat in 1-player solo playtests, it’s very unfortunate that it doesn’t work properly in multi-client local playtests, since that adds extra friction for solo developers who want to test Voice Chat features that would otherwise require separate Voice Chat-enabled accounts.

Thus far, for all of the multiplayer Voice Chat testing I’ve done with the new Audio API, I’ve had to publish the game, leave Roblox Studio, then join a live server of that game from my computer on one account and from my phone on another account.


I had wondered if it would be possible to seamlessly host local playtests with multiple clients for testing out “multiplayer Voice Chat situations” ever since the “Chat with Voice Developer Beta” was announced in 2021:

Although the primary focus of my question back then was whether developers would be able to locally test Voice Chat functionality regardless of ID verification, it’s really unfortunate that it’s not currently possible to simulate multiple clients communicating with one another via Voice Chat in Roblox Studio, even if the account being used for the multi-client local playtest is ID verified and has Voice Chat enabled.

8 Likes

Right, so after playing around with the new audio APIs, I’ve come to the conclusion that the new systems are currently very hard to work with overall. It feels as if the new Audio API was designed around the idea that you’ll just set up some audio source and never touch it again, aside from deleting it once it’s no longer needed. Swapping audio effects in and out is unreasonably inefficient, as we’re pretty much forced to constantly cycle through every Wire and whatever other instances are needed just to make the entire process work. One idea I have to potentially fix this would be some sort of “Wire Hub” instance: basically, an instance that lets you hook up multiple input wires and a single output wire, so you don’t need to constantly rechain many other Wire instances.
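To make the friction concrete, here’s roughly what splicing a single effect in and out of a simple chain looks like right now (a sketch based on my understanding of the current Wire / AudioReverb instances, not polished code):

-- Splicing an AudioReverb into an existing AudioPlayer -> AudioDeviceOutput chain,
-- then collapsing the chain back down again (AssetId / Play omitted; this is only about the wiring)
local SoundService = game:GetService("SoundService")

local audioPlayer = Instance.new("AudioPlayer")
audioPlayer.Parent = SoundService

local deviceOutput = Instance.new("AudioDeviceOutput")
deviceOutput.Parent = SoundService

local directWire = Instance.new("Wire") -- player -> output
directWire.SourceInstance = audioPlayer
directWire.TargetInstance = deviceOutput
directWire.Parent = audioPlayer

local function insertReverb()
	local reverb = Instance.new("AudioReverb")
	reverb.Parent = SoundService

	directWire.TargetInstance = reverb -- the one wire becomes: player -> reverb

	local reverbWire = Instance.new("Wire") -- plus a second wire: reverb -> output
	reverbWire.SourceInstance = reverb
	reverbWire.TargetInstance = deviceOutput
	reverbWire.Parent = reverb

	return reverb, reverbWire
end

local function removeReverb(reverb, reverbWire)
	directWire.TargetInstance = deviceOutput -- back to: player -> output
	reverbWire:Destroy()
	reverb:Destroy()
end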

Overall I think the new system is great; it just feels as if it’s half complete (or at least half usable).

(Also, a proper pitch-shifting update, yes?)

3 Likes

In my game I had this enabled out of curiosity.

I’ve come across a bug where, over time, players’ voices will be heard globally.

It’s a bit of a silly bug and I’m not complaining about it (though I know it can become annoying). I’m not sure if this has been reported yet, and the voices can still be heard when the Roblox volume slider is at 0 (as in the video).

6 Likes

When we enabled this in our game we had users report they couldn’t access it because it kept crashing. They said they had no error and it just closed the player window. I’m not sure if it’s like this for all the users that were crashing, but some reported that it was not crashing in the Windows Store app, while it was in the normal PC app.

2 Likes

Is push-to-talk going to be added, as mentioned in this post:

Also, is there going to be a way to disable the default proximity chat? For example, if I wanted to make a team voice chat system.

4 Likes

Has anyone got AudioAnalyzer to work when connected to an AudioListener?
It seems to always return 0s and never picks up the audio, even when everything is properly wired up and audio is being received by the listener. I’ve also tried hooking the listener to a fader and the analyzer to that fader, but still no results.
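For reference, here’s roughly the wiring I’m describing, in case anyone can spot a mistake in the approach (this assumes the default AudioListener in the character keeps its class name, and that PeakLevel / RmsLevel are the right properties to poll):

-- LocalScript: wire the character's AudioListener into an AudioAnalyzer and poll its levels
local Players = game:GetService("Players")
local RunService = game:GetService("RunService")

local character = Players.LocalPlayer.Character or Players.LocalPlayer.CharacterAdded:Wait()
local listener = character:WaitForChild("AudioListener")

local analyzer = Instance.new("AudioAnalyzer")
analyzer.Parent = listener

local wire = Instance.new("Wire")
wire.SourceInstance = listener
wire.TargetInstance = analyzer
wire.Parent = analyzer

RunService.Heartbeat:Connect(function()
	-- Both of these always read 0 for me, even while the listener is clearly receiving audio
	print(analyzer.PeakLevel, analyzer.RmsLevel)
end)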

3 Likes

It might just be because AudioAnalyzers are disabled right now; they’ll likely make an announcement when they’re back up again.

2 Likes

Turning off the EnableDefaultVoice property of VoiceChatService is the way to disable the default proximity-based Voice Chat, since that property is what automatically creates the AudioEmitter and AudioListener within each player’s Character model (two of the primary instances that make it possible for players to emit and hear proximity-based Voice Chat audio when using the new Audio API).
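If it helps, here’s a rough sketch of manually recreating that default setup once EnableDefaultVoice is turned off (e.g. unchecked on VoiceChatService in the Properties window), which is the point where you could start customizing it, such as restricting it per team. The wiring below is just one way to do it, and the property usage reflects my reading of the documentation rather than a confirmed setup:

-- Server Script: manually recreating proximity voice with EnableDefaultVoice turned off
local Players = game:GetService("Players")

Players.PlayerAdded:Connect(function(player)
	-- Capture this player's microphone
	local deviceInput = Instance.new("AudioDeviceInput")
	deviceInput.Player = player
	deviceInput.Parent = player

	player.CharacterAdded:Connect(function(character)
		-- Emit the captured voice from the character's position
		local emitter = Instance.new("AudioEmitter")
		emitter.Parent = character

		local inputToEmitter = Instance.new("Wire")
		inputToEmitter.SourceInstance = deviceInput
		inputToEmitter.TargetInstance = emitter
		inputToEmitter.Parent = emitter

		-- Let this player hear 3D audio picked up near their character
		local listener = Instance.new("AudioListener")
		listener.Parent = character

		local deviceOutput = Instance.new("AudioDeviceOutput")
		deviceOutput.Player = player -- only route this listener's audio to this player
		deviceOutput.Parent = character

		local listenerToOutput = Instance.new("Wire")
		listenerToOutput.SourceInstance = listener
		listenerToOutput.TargetInstance = deviceOutput
		listenerToOutput.Parent = deviceOutput
	end)
end)

From there, a team-based system could, for example, skip the proximity emitter entirely and wire each AudioDeviceInput only to outputs belonging to teammates, or restrict who hears each input with the AccessType property and SetUserIdAccessList.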

I had initially thought there was going to be a built-in feature for push-to-talk, too, but considering that several other features mentioned alongside it (e.g. voice modification and walkie-talkies) require scripts to interact directly with the new Audio API, it seems like push-to-talk might be a feature we have to code ourselves.


Fortunately, it appears that push-to-talk is fairly easy to implement :smile: Here’s a pretty barebones version I created that enforces push-to-talk by locally toggling the Muted property of the player’s AudioDeviceInput depending on whether the player is holding down the specified keybind.

(Note that more would need to be added to this in order to make it mobile-compatible, along with other quality-of-life features such as letting players specify their own keybind.)

Push-to-talk Example

-- LocalScript in StarterPlayerScripts
local UserInputService = game:GetService("UserInputService")
local Players = game:GetService("Players")
local player = Players.LocalPlayer

local audioDeviceInput = player:WaitForChild("AudioDeviceInput") -- If you manually created an AudioDeviceInput with a different name, make sure to update this to the new name
audioDeviceInput.Muted = true -- Muting the AudioDeviceInput immediately since it starts out as being unmuted, by default

local pushToTalkKeybind = Enum.KeyCode.C -- Define the key that players have to hold down while talking in order to be heard by other players


UserInputService.InputBegan:Connect(function(inputObject, gameProcessedEvent) -- Function activated when the player interacts with the mouse, keyboard, etc.
	if inputObject.UserInputType == Enum.UserInputType.Keyboard and gameProcessedEvent == false then -- Checks if the type of input was from a keyboard and makes sure that the player wasn't interacting with UI at the time (such as typing in a TextBox)
		
		local keycode = inputObject.KeyCode -- Refers to the KeyCode of the InputObject, which will be used to check which key on the keyboard the player pressed
		
		if keycode == pushToTalkKeybind then -- If the key that the player pressed matches the "pushToTalkKeybind" defined at the top of the LocalScript, then...
			audioDeviceInput.Muted = false -- The LocalScript unmutes the AudioDeviceInput since the player is holding down the push-to-talk key
		end
	end
end)

UserInputService.InputEnded:Connect(function(inputObject, gameProcessedEvent) -- Function activated when the player stops interacting with the mouse, keyboard, etc.
	if inputObject.UserInputType == Enum.UserInputType.Keyboard and gameProcessedEvent == false then
		
		local keycode = inputObject.KeyCode
		
		if keycode == pushToTalkKeybind then -- If the player let go of the push to talk key, then...
			audioDeviceInput.Muted = true -- The LocalScript mutes the AudioDeviceInput since the player is no longer holding down the push-to-talk key
		end
	end
end)
6 Likes

Hey purpledanx, we made the GetConnectedWires method accessible to plugins for starters – did you have a use case in mind for traversing the audio graph at runtime?

2 Likes

Hey StrongBigeMan9, we just updated the docs for AudioDeviceInput today – the AccessType property is intended to pair with SetUserIdAccessList.
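For example, something along these lines on the server (a minimal sketch rather than a drop-in implementation; the user IDs are placeholders):

-- Server Script: only let specific users hear a player's AudioDeviceInput
local Players = game:GetService("Players")

local function restrictWhoCanHear(player, allowedUserIds)
	local deviceInput = player:WaitForChild("AudioDeviceInput")
	deviceInput.AccessType = Enum.AccessModifierType.Allow -- treat the list as an allow-list
	deviceInput:SetUserIdAccessList(allowedUserIds)
end

Players.PlayerAdded:Connect(function(player)
	restrictWhoCanHear(player, {261, 156}) -- placeholder user IDs
end)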

3 Likes

Will there ever be something like Nodes, where you can connect wires to a Node and then connect the Node to all the speakers or effects? I find it very clunky to connect wires to an Emitter, use a Listener, and then wire every speaker around the map to that Listener.

For example, let’s say you’re making an intercom system and you want to play a cool sound before relaying the audio. You could connect an AudioPlayer to the Node, play the sound, and then connect the player’s microphone to the Node.

1 Like

I have a dynamic reverb system that gets every audio emitter and traces back to the audio modifiers to change the values of the fader, reverb, gain, etc. So the only way I could trace back would be to find the connected wires.

3 Likes

Thanks for updating it and letting me know! The documentation provided there mostly clears up the questions I had.

I had actually been updating my original post for the past half hour when I first noticed the documentation was updated, but realized it doesn’t appear to answer all of the questions I had.

I’ll quote the sections of my original post that describe behavior with the AccessType property that appear unintentional, given what is documented about it at the moment:

2 Likes

Ahh I see – for the time being you might be able to use CollectionService/tags to speed up lookup, but that’s not as general; I can see how GetConnectedWires makes this nicer.

We’ll discuss and get back.
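In the meantime, a tag-based version could look something like this (a rough sketch that assumes the modifiers are created by your own scripts, so they can be tagged at creation time):

-- Tag each modifier when it's created, then fetch them all by tag instead of traversing wires
local CollectionService = game:GetService("CollectionService")

local REVERB_TAG = "DynamicReverb" -- any tag name you like

-- Wherever the audio graph gets built:
local function createReverbFor(emitter)
	local reverb = Instance.new("AudioReverb")
	reverb.Parent = emitter
	CollectionService:AddTag(reverb, REVERB_TAG)
	-- ...wire it into the chain as usual...
	return reverb
end

-- In the dynamic reverb system, adjust every tagged modifier directly:
local function updateReverbs(adjust)
	for _, reverb in ipairs(CollectionService:GetTagged(REVERB_TAG)) do
		adjust(reverb)
	end
end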

6 Likes

In terms of moderation, how will users go about reporting people who, say, use a broadcasting system where their username isn’t shown but they are saying inappropriate things?

3 Likes

Hey XenoDenissBoss1,

Swapping audio effects in and out is unreasonably inefficient, as we’re pretty much forced to constantly cycle through every Wire and whatever other instances are needed just to make the entire process work. One idea I have to potentially fix this would be some sort of “Wire Hub” instance: basically, an instance that lets you hook up multiple input wires and a single output wire, so you don’t need to constantly rechain many other Wire instances.

You can use AudioFaders as hub nodes for organization; for example accepting many input streams and producing one output stream. With .Volume set to 1, this does not modify the input streams at all – does that help?
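For example, something like this (just a sketch of the shape; parenting and names are up to you):

-- One AudioFader acting as a pass-through hub: many inputs in, one output out
local SoundService = game:GetService("SoundService")

local hub = Instance.new("AudioFader")
hub.Volume = 1 -- leaves the streams unmodified
hub.Parent = SoundService

-- Wire any number of sources into the hub...
local function connectToHub(source)
	local wire = Instance.new("Wire")
	wire.SourceInstance = source
	wire.TargetInstance = hub
	wire.Parent = hub
	return wire
end

-- ...and a single wire out of it. Swapping an effect downstream now only means
-- re-targeting this one wire instead of every source's wire.
local deviceOutput = Instance.new("AudioDeviceOutput")
deviceOutput.Parent = SoundService

local hubOut = Instance.new("Wire")
hubOut.SourceInstance = hub
hubOut.TargetInstance = deviceOutput
hubOut.Parent = hub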

4 Likes

Since player voices are networked, changing AccessType or calling SetUserIdAccessList locally is not guaranteed to work well – however, that server-side update behavior does not sound intended; we’ll look into it

3 Likes

We need something like a settable table of players that the Emitter will ignore, so we don’t hear our own voice.

3 Likes

I’m turning on the Audio API stuff, and I’ll share my thoughts once I get a grip on how it works.

2 Likes