this post was submitted on 01 Sep 2023

Free and Open Source Software

Does anybody know about a good voice-to-text tool? I improvise song lyrics a lot and often thought about how useful it would be to have it written down so I can salvage the good parts.

all 7 comments
[–] TheHobbyist@lemmy.zip 13 points 1 year ago (2 children)

OpenAI's Whisper model is a really great one: it supports many languages and translation, and it's available both as a pretrained model (https://github.com/openai/whisper), which can be self-hosted, and as an API (https://openai.com/pricing).
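
If you want to try the self-hosted route, here's a minimal sketch (untested; it assumes the openai-whisper package and ffmpeg are installed, and "recording.mp3" is just a placeholder for your own audio file):

```python
# pip install -U openai-whisper   (also needs ffmpeg on the PATH)
import whisper

# "base" is a small multilingual model; "small"/"medium"/"large" are more
# accurate but slower and need more memory.
model = whisper.load_model("base")

# transcribe() decodes the audio via ffmpeg and returns the full text
# plus per-segment timestamps.
result = model.transcribe("recording.mp3")
print(result["text"])

# Timestamps make it easier to find the good parts of a long improvisation.
for segment in result["segments"]:
    print(f"[{segment['start']:.1f}s] {segment['text']}")
```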

[–] doodimus@beehaw.org 1 point 1 year ago (1 child)

I just found out this was FOSS yesterday, and I really want to start playing with it on my system. Is it GPU-agnostic? I have an AMD card and don't see any mention of GPU support or a CUDA requirement in the GitHub docs.

[–] TheHobbyist@lemmy.zip 2 points 1 year ago (2 children)

AMD has ROCm, which is available on AMD Radeon Instinct GPUs (server GPUs) and some consumer GPUs. You'd need to double-check whether your GPU supports ROCm.

It seems there is some discussion on using ROCm with Whisper here: https://github.com/openai/whisper/discussions/105, and here (showing it might be possible?): https://github.com/openai/whisper/discussions/55
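
Assuming a ROCm build of PyTorch is installed (e.g. the ROCm wheels from pytorch.org), it exposes the AMD GPU through the usual torch.cuda API, so a rough sketch for pointing Whisper at it would look like this ("recording.mp3" is again a placeholder filename):

```python
# Rough sketch, assuming a ROCm build of PyTorch plus the openai-whisper package.
import torch
import whisper

# On a ROCm build this returns True when the AMD GPU is usable.
use_gpu = torch.cuda.is_available()
print("GPU available:", use_gpu)

device = "cuda" if use_gpu else "cpu"
model = whisper.load_model("small", device=device)

# fp16 only makes sense on the GPU; the CPU path falls back to fp32.
result = model.transcribe("recording.mp3", fp16=use_gpu)
print(result["text"])
```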

[–] TheHobbyist@lemmy.zip 2 points 1 year ago

I also found this which could be of interest:

MLC-LLM, whose goal is to "Enable everyone to develop, optimize and deploy AI models natively on everyone's devices."

Here it's used to deploy Llama-2-13B on the RX 7900 XTX:

https://blog.mlc.ai/2023/08/09/Making-AMD-GPUs-competitive-for-LLM-inference?ref=upstract.com

[–] doodimus@beehaw.org 2 points 1 year ago

Thanks for that. I've been able to get Stable Diffusion running locally with ROCm, so it looks like it should be possible then.

[–] realslef@fedia.io 1 points 1 year ago

Not great yet, and not even good for my accent, but Dicio is the only FOSS one I've got working on Android so far.