Sony has been fixated on marketing this unique controller for immersion: selling players on the experience of feeling a bow being drawn taut, or the tippy tapping of raindrops against their palms. When it comes to accessibility for disabled players, there's so much potential in these intricate haptics. It feels, though, as if the focus on immersion is leading developers to forget about providing a balance, or an optional choice, between immersion and functionality.

I'll jump straight into how the controller's haptics can be useful for accessibility; the adaptive triggers are a story for another day. I'm Deaf and I wear hearing aids, yet I still miss key sounds, or fail to make sense of the newfangled 3D audio. Haptics could guide me with directionality, immerse me in a world to set the scene, and highlight a successful puzzle completion. They could also help players who are blind or have low vision by flagging interactive elements, making menu navigation more tactile, and warning of nearby dangers such as enemies or ledges.

PlayStation studios have been showing how haptics can serve both immersion and functionality, and how to use them well. This probably comes as no surprise, as the company and its first-party studios have been making strides with accessibility, from The Last of Us Part 2 and Able@PS to the dedicated accessibility website. Meanwhile, other studios making use of the haptic technology seem to go a bit wild with immersion.

Spider-Man: Miles Morales is a wonderful example of PlayStation doing functional haptics right: the game provides directional feedback for nearby clues and incoming damage. The PS5 Legacy of Thieves collection of Uncharted 4 uses haptics to tell the player that a puzzle has been completed correctly, while blending that cue nicely into the world design.
Then there are titles such as Ratchet and Clank: Rift Apart that introduce two types of feedback. Functional haptics give players tactile information about useful cues related to progression, while experimental haptics offer a more immersive experience in which everything around the player is splurged into their palms. More importantly, the game offers an intensity slider for those who may be sensitive to strong vibrations. Horizon Forbidden West took intensity adjustments a step further by offering sliders for various haptic types, such as UI, traversal, combat, and environmental. A wonderful choice for those who prefer functional cues over the blockbuster feeling of foliage rubbing against their arm.

And for good reason, as recent games have made use of the DualSense haptics with a stronger focus on immersion. Ghostwire: Tokyo uses haptics to provide immersive feedback for elements such as casting powers, gathering souls, jumping, and even rainfall. Players can only toggle the whole experience on or off, and with so much tactile noise hitting the palm it can be rather jarring. As an example, while fighting a tall spectral being that bears a resemblance to Lady Dimitrescu, I found myself overwhelmed by how much information was rushing to my hands. I felt an uneven tapping across the controller but couldn't figure out whether another enemy was off-frame or I'd just discovered a directional cue. It turned out to be the rain and some of the spells I'd been casting in tandem. I was left longing for useful features like directional indicators, low-health warnings, or even a warning of incoming attacks so I could finally block on time.

Returnal is another example that bombards rain haptics across the controller's body. Of course, immersive features were what the team wanted to have.
The trouble is that rain haptics, layered on top of other immersive haptics such as weapons and movement types, stand a chance of burying the critical cues that are available. With no way to isolate features, players are forced into an immersive experience that may not be what they require.

It's clear that the DualSense controller is an exciting new concept. It's even been confirmed that the upcoming PS VR2 headset will feature haptic technology, which could be very useful for d/Deaf and hard of hearing players and those with visual impairments. However, with the controller's haptics already being used inaccessibly in the examples above, developers need to start seeing the benefits of providing functional uses and granular adjustments. They can do that by following the lead of the PlayStation Studios titles that do it well and provide customization, instead of forcing players into a default way of playing that may be uncomfortable.

Why should they? Because players are diverse. Some may not enjoy the tippy tapping of their toes landing after every jump; some may enjoy the rumble of a giant beast roaring in their face. Some may just want to understand elements of the game better without the blockbuster experience. As it happens, people with disabilities have had concerns about the DualSense since its announcement, for many reasons. Those who struggle with sensitivity, chronic pain, and overstimulation may find it more comfortable to receive only hints instead of an entire city crumbling in their hands. d/Deaf and hard-of-hearing players may want that immersive experience to better embed themselves in the world, but with granular adjustments so that critical information is not lost. Those with low vision or blindness could make use of directional cues to navigate the world and understand where danger is.

The immersive experience the DualSense provides is wonderful. It's a gateway to more detailed gaming experiences that can really bring digital worlds to life. However, as with any new technology, developers need to adapt, recognize barriers, and not get lost in the excitement of new features, lest that excitement lead to exclusion.