
Use iDrive Knob to send keyboard commands within Screen Mirroring for Android Auto on casting phone #28

Open
r3dalk opened this issue Jun 27, 2024 · 4 comments

Comments

r3dalk commented Jun 27, 2024

Hold your horses: I know the quick answer is that AA is not supported, but I have been fiddling around with some things.

First of all, this is my first time opening an issue on an open/public project, so apologies in advance for any mistakes. Also, your app literally saved me from dropping an absurd amount of money on an iPhone, so I'm really, really thankful for that!

I'll detail what I've tried and achieved, and then explain the issue/request I'm making, to understand whether it's achievable (and I'm hoping it is).

I have a 2021 BMW 2 Series (F22), reported through AAIdrive as an NBTevo_ID5, and have been using the screen_mirror plugin since I got the car to stream Waze, running AAIdrive on a Samsung S23 Ultra with the screen_mirror addon, always through Bluetooth.

I have set up multiple Samsung routines that make the experience somewhat more usable (auto-rotate Waze to landscape if the car's BT is detected, night/day mode based on BT and time of day, auto-tap the screen to allow the screen mirroring permissions), and then I tried the next step:
I got an old Android phone, installed AAIdrive on it, and ran Android Head Unit Remote to emulate an Android Auto head unit on the old phone, then connected it to the car. That streams the Android Auto interface onto the iDrive, with my S23 connected only to the old phone. This worked, but there is no input method other than touching the old phone's screen, which is impractical.
Then I installed an app on my S23 that emulates a Bluetooth keyboard, and coded five buttons (see the sketch after this list):

  • Left arrow and Right arrow (cycle through current Android Auto active view)
  • Shift + left arrow and Shift + Right arrow (cycle through Android Auto active views and side bar options, changing from Waze to Spotify, or selecting the menu and such)
  • Enter (enter/click on selected item)
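
For reference, these combinations correspond roughly to standard Bluetooth HID keyboard reports. A minimal Kotlin sketch, assuming the 8-byte boot keyboard report layout (the keyboard-emulator app I use may well encode them differently):

```kotlin
// Sketch: the five "buttons" as 8-byte HID boot keyboard reports
// (modifier byte, reserved byte, then up to six key usage IDs).
// Usage IDs per the HID Usage Tables: Right Arrow 0x4F, Left Arrow 0x50,
// Enter 0x28; Left Shift is bit 0x02 of the modifier byte.
object KnobReports {
    private fun report(modifier: Int, key: Int) = byteArrayOf(
        modifier.toByte(), 0x00, key.toByte(), 0x00, 0x00, 0x00, 0x00, 0x00
    )

    val LEFT        = report(0x00, 0x50)  // Left arrow
    val RIGHT       = report(0x00, 0x4F)  // Right arrow
    val SHIFT_LEFT  = report(0x02, 0x50)  // Shift + Left arrow
    val SHIFT_RIGHT = report(0x02, 0x4F)  // Shift + Right arrow
    val ENTER       = report(0x00, 0x28)  // Enter
    val RELEASE     = report(0x00, 0x00)  // all keys up, sent after each press
}
```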

This is a tad more convenient, but there's quite some delay in the S23 -> old phone BT -> AA interface -> screen mirror chain, which makes it nearly unusable. I thought (hence the title of my issue) that maybe, while screen_mirror is running in the car, there would be a way to map the free knob movements to those keystrokes, particularly the way the 2018-202 Audi MMI interface knob works:

  • Push knob up/down: cycle through views (Shift + Left arrow and Shift + Right arrow BT key presses)
  • Scroll left/right: navigate the selected view (cycle through the current Android Auto active view)
  • Press: enter

I know the left and right knob tilts are reserved for going back and into the side screen options, so I didn't include those. I think (talking in the dark here) that if there is a way to detect the knob's pressing/tilting/scrolling (or maybe by implementing floating/transparent buttons), the action on pressing those buttons could be to send a keyboard input to the phone (or the direct command, though I don't know if that's possible in Android).

I have downloaded the addon's code and tried to fiddle around with it using your wiki on the BMW Connected Apps protocol and some help from GPT, but I have no Kotlin knowledge (I'm a Java, Bash and Python guy) and it's been a while since I last actively coded, so my efforts were nothing short of embarrassing.

Still, I think it might be feasible to make the car send those commands to the phone, or, if it's too niche a request, I'd be thankful for some guidance on how to implement it, in case there's some good Samaritan around who knows Kotlin and Android development and is willing to give the idea a go. I had thought it could be added as an option to the addon, like a toggle, so that people who use the app just for mirroring don't send unwanted commands.

I'd be more than glad to see full AA working on the car, but even done this way, I think it would come close enough to be well worth having. My next step, if these inputs from the car turn out to be possible, would be to get a Pi Zero W with a BT module to run an AA emulator, plug it into the car's USB, and get an almost-seamless AA experience on a BMW the way BMW never intended. I'd be more than glad to do that (I have the skills required for it, unlike the ones I'd need for the modification I'm requesting :) )

I may not have worded some things correctly, or may not have gone into enough detail where needed, but I guess the TL;DR of the request is: could some simple knob-to-phone commands be implemented as keyboard inputs to interact with the phone, or could you provide some guidance on how it could be done?

If any doubts or questions arise, please don't hesitate to ask! I'll do my best to get this working.
Also, if the idea can be done, I'll be more than happy to document and provide step-by-step details on how to get the "almost AA" experience!! :)

hufman (Collaborator) commented Jun 28, 2024

Good morning, and thank you for the compliments!
Unfortunately, the protocol that sends events from the car to the phone apps is relatively limited. I have not found any touch screen access at all. The protocol defines constants that suggest TILT events might have been possible, but I haven't found any callbacks that use them.
As far as I can tell, the only useful input (used in the map feature) is to have an invisible list that is scrolled through to capture scroll events, along with the normal press event.
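
Very roughly, the trick looks like this (a sketch with hypothetical names, not the actual IDriveConnectKit API, just to illustrate how the scroll direction and the press can be recovered from a list):

```kotlin
// Sketch of the "invisible list" input capture, using made-up callback names.
// The list is a dummy with enough entries that the knob can always scroll in
// either direction; the change in selected index gives the scroll direction,
// and the selection is then reset to the middle so the ends are never reached.
class KnobInputCapture(
    private val onScroll: (delta: Int) -> Unit,
    private val onPress: () -> Unit
) {
    private var lastIndex = CENTER

    // Hypothetical callback for "the car moved the list selection"
    fun onSelectionChanged(newIndex: Int) {
        val delta = newIndex - lastIndex
        if (delta != 0) onScroll(delta)
        lastIndex = CENTER  // in the real protocol: ask the car to re-select CENTER
    }

    // Hypothetical callback for "the knob was pressed on the selected entry"
    fun onItemClicked() = onPress()

    companion object {
        const val CENTER = 50  // middle of a 100-entry dummy list
    }
}
```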

Once such an input is received, the next challenge is to send the event to the phone app. As far as I know, the main generic way is to use the accessibility framework to locate and push buttons on the foreground app. However, this use of the accessibility function is not allowed by the Play store.
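
For illustration, that accessibility route would look something like this (a minimal sketch using the standard Android AccessibilityService API, not code from this project; the Play Store policy, not the code, is the blocker):

```kotlin
import android.accessibilityservice.AccessibilityService
import android.view.accessibility.AccessibilityEvent
import android.view.accessibility.AccessibilityNodeInfo

// Sketch: locate a clickable node in the foreground app by its visible label
// and click it. The label passed in ("Spotify", "Navigate", ...) is whatever
// button the knob action is supposed to trigger.
class RemoteClickService : AccessibilityService() {

    fun clickButtonLabeled(label: String): Boolean {
        val root = rootInActiveWindow ?: return false
        val matches = root.findAccessibilityNodeInfosByText(label)
        val clickable = matches.firstOrNull { it.isClickable } ?: return false
        return clickable.performAction(AccessibilityNodeInfo.ACTION_CLICK)
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {}
    override fun onInterrupt() {}
}
```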

I made a branch a few years ago to test out this feature, but I wasn't happy with the latency or with my quick attempt at guessing an order in which to scroll through the on-screen buttons, so I never finished or merged it. Perhaps you might find it useful as a starting point!

I look forward to hearing more about your work, and answering any questions that I can :)

r3dalk (Author) commented Jun 28, 2024

Hello! Thanks for your quick answer!

I had already ruled out touchscreen support based on your comments in other tickets, don't worry.
My proposed approach is quite a bit more... rudimentary, now that I've come to (kind of) understand the limitations of the platform we're working with.

The inputs I proposed (push knob up/down, scroll left/right and click) were the knob interactions I thought/understood could maybe be detected or mapped. As far as I've seen, it's possible to detect the knob press, and from what you mention, the scrolling too, so we'd only be left with finding a way to get up and down working. My other, more rudimentary idea was to draw an overlay of buttons, kind of like the joystick overlay on an emulator, that allows reaching all the inputs via scrolling (and/or making it invisible once we're familiar with it), or a sidebar with the controls, the way the native maps app works on BMW.

The translation to the phone, though, may be easier, since the key presses can be registered as keyboard inputs, and as far as I know Android can accept those. As a last resort, a new addon could be forked from the screen mirror one that "identifies" itself as a keyboard app in Android, allowing the desired inputs (and maybe any others, if that's really possible).

I highly appreciate the branch you've linked; I hope anyone who knows Kotlin and Android development may reach out or comment so we can work on it. (Everyone reading this: if you have any experience, please feel free to comment, it's always welcome!)

I don't know if you have experience with the Android keyboard input interface/integration, but assuming that part is done, what would really be left (in the design phase, obviously) would be either capturing those up/down movements of the knob or adding a sidebar/overlay with the inputs. Would that be feasible?

As a side note, and as a personal quirk, I'd like to ask whether the empty space at the bottom of the screen mirroring view could be used as a line to display some of the car's info (like the current trip's average consumption). It just bugs me having to scroll right to check the average consumption for the current ride in the mini view, given that BMW refuses to provide it on the HUD.

Thanks again for your time!

hufman (Collaborator) commented Jun 29, 2024

I haven't seen anything in the protocol that I've decoded that suggests how to capture the knob tilt commands :( So as far as I know, the only inputs are the scrolling and the button press, which is how the custom map and the test branch implement it.
I'm not sure implementing a custom keyboard works either; I believe Android only accepts input from a software keyboard while it's open. How does your Bluetooth keyboard app work? I assumed it was one phone acting as a Bluetooth keyboard for another phone.

Technically anything could be drawn in the image that is shown to the car: the custom map is just a normal Android Activity that gets screenshotted, and the screen mirror is a bitmap that can be drawn onto (the screen mirror input branch draws some boxes on top to indicate clickable items). However, that gets into building configurable widgets for car data, which I don't know how to do.
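
As a rough illustration of drawing onto that bitmap (plain Android Canvas calls, not the actual code from the branch):

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.graphics.RectF

// Sketch: draw a highlight box onto the frame that gets sent to the car,
// e.g. to mark which on-screen "button" is currently selected.
// Assumes the frame bitmap is mutable.
fun drawSelectionBox(frame: Bitmap, box: RectF) {
    val paint = Paint().apply {
        style = Paint.Style.STROKE
        strokeWidth = 6f
        color = Color.YELLOW
    }
    Canvas(frame).drawRect(box, paint)
}
```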

r3dalk (Author) commented Jul 2, 2024

Sorry for the late response, it's been a rough weekend and I had to pick up the pace at work. I appreciate the time and effort you put into these requests, hufman.

So we've made some progress: as far as we can tell, we can (temporarily at least) discard the knob tilt and opt for the emulator-style scroll-and-select mode.

Regarding the Bluetooth keyboard thing, the app just acts as a Bluetooth keyboard, connects to the AA phone and sends the key combinations I stated in the initial post. AFAIK, the same can be done by any app that "identifies" itself as a keyboard, or has keyboard functionality. I've seen a couple of apps do this, so it's possible, although I'm not entirely sure how it's implemented. I've read something about Android's InputMethodService, which could be used to make the addon show up as an input method and run in the background, so that when AA runs it can send the keypresses (just an idea).
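
Something along these lines is what I had in mind for the IME route (a sketch using Android's InputMethodService; whether the events are actually delivered when no text field of the foreground app has focus is exactly the open question you raised):

```kotlin
import android.inputmethodservice.InputMethodService
import android.view.KeyEvent

// Sketch: an input method that injects D-pad/Enter key events into whatever
// app currently holds the input connection. It only works while this IME is
// the selected keyboard and an input connection exists, which is the big
// caveat for the Android Auto case.
class KnobKeyService : InputMethodService() {
    fun sendLeft()  = sendDownUpKeyEvents(KeyEvent.KEYCODE_DPAD_LEFT)
    fun sendRight() = sendDownUpKeyEvents(KeyEvent.KEYCODE_DPAD_RIGHT)
    fun sendEnter() = sendDownUpKeyEvents(KeyEvent.KEYCODE_ENTER)

    // Shift+arrow combos would need a manually built KeyEvent with
    // META_SHIFT_ON sent through currentInputConnection, which I haven't tried.
}
```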

On to the drawing: I understand the concept, but I'm not sure how I'd do it since I'm quite lost in Kotlin and Android app development.

The roadmap for the updated addon would be:

  • Compile alternative version of screen_mirror (idea: screen_mirror_aa) with resized space
  • Design and draw navbar controls under/next to screen mirroring image
  • Design an IME/alternative to make sending keypresses possible
  • Make controls send the identified keypresses to the phone

I'm not sure if you'd know of, or could help by suggesting, anyone who could help me with the low-level coding part, as I'm not well versed in the Kotlin/Android environment and it seems quite a niche task within Android's boundaries...

hufman transferred this issue from BimmerGestalt/AAIdrive Sep 5, 2024