LLaMA 🦙 is out now

erek

Meta's large language model announced

“Meta also said that it will make its models available to the research public and is taking applications from researchers. The underlying models for Google's LaMDA and OpenAI's ChatGPT are not public.

"Meta is committed to this open model of research and we'll make our new model available to the AI research community," Zuckerberg wrote.”



Source: https://www.cnbc.com/2023/02/24/mark-zuckerberg-announces-meta-llama-large-language-model.html
 
Yawn...

This fad shall also pass.
Competition is good though, and I do like the openness of Meta's model. I have dabbled with their previous offerings under BlenderBot, so this is actually refreshing.

 
I want to see a smaller version of these trained on video game environments, one that can fit locally on my PC, so that I can chat with NPCs.
 
Me three. I still use Winamp v2.91 to Whip the Llama's ass on my Win10 machines.

As to the subject at hand: I am highly dubious about the ever-increasing reliance on AI, especially in the medical field, whether contextual human-centric algorithms or otherwise.
 
1. Someone will attempt to consummate intimate relations with it.
2. It will be used to cheat in an online game / college.
3. It can make funny responses to philosophical questions.
 
That's really nice of Meta to do that, but I am afraid they are going to get their asses whipped by Nullsoft's Weighted Intelligence Network for Advanced Machine Processing.
It looks to have a superior stance on the market.

 


Regarding the Winamp MP3 player: not even Winamp is safe from AI improvements and augmentations.

"The MP3 format uses the Fast Fourier Transform (FFT) algorithm to compress audio data. FFT is a widely used algorithm for signal processing that converts a signal from the time domain into the frequency domain, allowing for efficient compression by removing redundant or unnecessary frequency components."

[Image: MP3 encoder block diagram]


"
to make FFT even faster for MP3 encoding. Here are some potential approaches:

  1. Neural Networks for FFT: One approach is to use neural networks to replace traditional FFT algorithms. For example, some researchers have proposed using deep learning models to learn the FFT function, which could potentially be faster and more accurate than traditional FFT algorithms.
  2. Hardware Optimization with AI: AI techniques such as reinforcement learning can be used to optimize hardware designs for FFT processing. This can result in hardware that is specifically tailored for FFT calculations, resulting in faster processing times.
  3. Predictive Encoding: AI and ML techniques can be used to predict the optimal parameters for MP3 encoding based on the input audio signal. This can result in faster and more efficient encoding by reducing the need for trial and error in determining optimal encoding parameters.
  4. Auto-Tuning: Auto-tuning techniques can be used to automatically adjust the parameters of the FFT algorithm based on the input signal. This can result in faster and more efficient FFT processing.
  5. Parallel Processing Optimization: AI techniques can be used to optimize the distribution of FFT processing across multiple processing units, such as CPUs, GPUs, or FPGAs. This can result in more efficient use of processing resources, resulting in faster FFT processing times.
Overall, AI, ML, and DL can be applied to make FFT even faster for MP3 encoding by optimizing hardware designs, predicting optimal encoding parameters, and improving parallel processing. While some of these techniques are still in the research phase, they have the potential to significantly improve the speed and efficiency of MP3 encoding."
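For what it's worth, points 4 and 5 above already have mundane, non-AI counterparts in today's libraries. Here is a hedged sketch using SciPy (assuming SciPy is installed; timings will vary by machine) that spreads a batch of frame FFTs across CPU cores via the workers argument:

import numpy as np
from scipy import fft
from time import perf_counter

rng = np.random.default_rng(0)
frames = rng.standard_normal((2048, 1152))   # a buffered batch of audio frames

t0 = perf_counter()
fft.rfft(frames, axis=1, workers=1)          # single-threaded FFT over the batch
t1 = perf_counter()
fft.rfft(frames, axis=1, workers=-1)         # same work across all available cores
t2 = perf_counter()

print(f"1 worker : {t1 - t0:.4f} s")
print(f"all cores: {t2 - t1:.4f} s")

An "AI auto-tuner" in the sense of point 4 would essentially be learning when that kind of split (or a particular transform size) actually pays off, rather than measuring it by hand.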

Lakados

 
Increasing FFT efficiencies is huge.

Increasing it for MP3 encoding is a zero burger.
 
“Increasing FFT efficiencies is huge.”
Apparently there have been some high-profile missteps related to some competing theoretical quantum computing approaches (quantum computers being the big hoped-for source of Fourier-transform speedups, via the quantum Fourier transform), so this notion is a bit unrealized at the moment, unfortunately. Several retracted academic papers from high-profile scientific journals and the lot.

“So-called topological quantum computing would avoid many of the problems that stand in the way of full-scale quantum computers. But high-profile missteps have led some experts to question whether the field is fooling itself.”

https://www.quantamagazine.org/major-quantum-computing-strategy-suffers-serious-setbacks-20210929/

 