AI & ML interests

None defined yet.

Recent Activity

Jeronymous  updated a dataset 3 days ago
OpenLLM-France/wikipedia
Jeronymous  updated a dataset 3 days ago
OpenLLM-France/wiktionary
Jeronymous  updated a dataset 3 days ago
OpenLLM-France/wikisource

OpenLLM-France's activity

prithivMLmods 
posted an update about 17 hours ago
Reasoning SmolLM2 🚀

🎯Fine-tuning SmolLM2 on a lightweight synthetic reasoning dataset for reasoning-specific tasks. Future updates will focus on lightweight, blazing-fast reasoning models. Until then, check out the blog for fine-tuning details.

🔥Blog : https://huggingface.co/blog/prithivMLmods/smollm2-ft
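
As a rough illustration (not the blog's exact recipe), here is a minimal supervised fine-tuning sketch with TRL. The dataset id is hypothetical, the base checkpoint is assumed to be SmolLM2-360M-Instruct, and the dataset is expected to hold chat-formatted "messages":

```python
# Minimal SFT sketch, assuming a chat-formatted synthetic reasoning dataset.
# The dataset id is hypothetical; the base model is an assumption.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Hypothetical dataset with a "messages" column (lists of {"role", "content"} dicts).
dataset = load_dataset("your-org/synthetic-reasoning-cot", split="train")

trainer = SFTTrainer(
    model="HuggingFaceTB/SmolLM2-360M-Instruct",  # assumed base checkpoint
    train_dataset=dataset,
    args=SFTConfig(
        output_dir="SmolLM2-CoT-360M-ft",
        per_device_train_batch_size=4,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-5,
        logging_steps=10,
        bf16=True,
    ),
)
trainer.train()
```

Pushing the result to the Hub and converting it to GGUF (as in the listed GGUF repo) would be separate follow-up steps.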

🔼 Models :
+ SmolLM2-CoT-360M : prithivMLmods/SmolLM2-CoT-360M
+ Reasoning-SmolLM2-135M : prithivMLmods/Reasoning-SmolLM2-135M
+ SmolLM2-CoT-360M-GGUF : prithivMLmods/SmolLM2-CoT-360M-GGUF

🤠 Other Details :
+ Demo : prithivMLmods/SmolLM2-CoT-360M
+ Fine-tune notebook : prithivMLmods/SmolLM2-CoT-360M




AkimfromParis 
posted an update 3 days ago
💵 Polymarket is leveraging the “Chatbot Arena LLM Leaderboard” on Hugging Face for online gambling on the question “Top AI model on January 31?”. 🤗

As of January 3rd, 2025:
1. Gemini (83%)
2. ChatGPT (13%)
3. Other (2%)
4. Claude (2%)
5. Grok (1%)
6. Llama (<1%)

🇺🇸 Market opinion follows historical data. It is clearly biased towards the historical US AI giants, yet Polymarket is forbidden in the USA and for US citizens.

🇨🇳 “Other” likely includes the Chinese AI labs that are probably the future AI leaders (Qwen, DeepSeek, Yi).

⚖️ In the market resolution, if two models are tied in the evaluation, alphabetical order decides (e.g. if Google and xAI were tied, “Google” would resolve to “Yes” and “xAI” to “No”). 🙃

Might that violate the Chatbot Arena usage policy? And maybe Hugging Face's? @clem
Or maybe the authors and contributors should get a cut each month as “market makers”. @weichiang @angelopoulos
prithivMLmods 
posted an update 6 days ago
Triangulum Catalogued 🔥💫

🎯Triangulum is a collection of pretrained and instruction-tuned generative models, designed for multilingual applications. These models are trained using synthetic datasets based on long chains of thought, enabling them to perform complex reasoning tasks effectively.

+ Triangulum-10B : prithivMLmods/Triangulum-10B
+ Quants : prithivMLmods/Triangulum-10B-GGUF

+ Triangulum-5B : prithivMLmods/Triangulum-5B
+ Quants : prithivMLmods/Triangulum-5B-GGUF

+ Triangulum-1B : prithivMLmods/Triangulum-1B
+ Quants : prithivMLmods/Triangulum-1B-GGUF
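
A minimal inference sketch for one of these checkpoints, assuming it ships a standard transformers chat template (not an official usage example):

```python
# Minimal sketch: chat with an instruction-tuned Triangulum checkpoint via transformers.
# Assumes the checkpoint exposes a chat template; adjust generation settings as needed.
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="prithivMLmods/Triangulum-1B",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
messages = [
    {"role": "user", "content": "Reason step by step: what is 17 * 24?"},
]
result = pipe(messages, max_new_tokens=256)
print(result[0]["generated_text"][-1]["content"])  # the assistant's reply
```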
1aurent 
posted an update 6 days ago
AkimfromParis 
posted an update 11 days ago
🇺🇸 🇨🇦 🇬🇧 Nobel Prize winners against USSR & Japanese AI pioneers ☭🇯🇵

🇩🇪 Prof. Jürgen Schmidhuber:  “The #NobelPrize in Physics 2024 for Hopfield & Hinton turns out to be a Nobel Prize for plagiarism. They republished methodologies developed in #Ukraine and #Japan by Ivakhnenko and Amari in the 1960s & 1970s, as well as other techniques, without citing the original inventors.”

1965 - First Deep Learning - USSR ☭ (Ukraine 🇺🇦 now)
Ivakhnenko and Lapa introduced the first deep-learning method: deep MLPs that learn internal representations of the input data.

1967/68 - Deep Learning by Stochastic Gradient Descent - Japan 🇯🇵
Shun-Ichi Amari trained MLPs with many layers in a non-incremental, end-to-end fashion from scratch by stochastic gradient descent (SGD).

1969 - Rectified linear unit - Japan 🇯🇵
In 1969, Kunihiko Fukushima introduced ReLU in the context of visual feature extraction in hierarchical neural networks.

1970 - Backpropagation - Finland 🇫🇮 😃
In 1970, Seppo Linnainmaa was the first to publish the reverse mode of automatic differentiation, now known as backpropagation.

1972 - Recurrent Neural Network - Japan 🇯🇵
In 1972, Shun-Ichi Amari published a learning recurrent neural network based on the Lenz-Ising model (Amari's network was later called the "Hopfield network"; Hopfield republished it in 1982 without citing Amari's papers).

1979 - First Convolutional neural network - Japan 🇯🇵
The CNN architecture, known as the Neocognitron, was introduced in 1979 by Kunihiko Fukushima.

https://people.idsia.ch/~juergen/deep-learning-history.html#AMH2
prithivMLmods 
posted an update 15 days ago
prithivMLmods 
posted an update 18 days ago
Qwen2VL Models: Vision and Language Processing 🍉

📍FT: [ LaTeX OCR, Math Parsing, Text Analogy OCRTest ]

Colab Demo: prithivMLmods/Qwen2-VL-OCR-2B-Instruct

❄️Demo : prithivMLmods/Qwen2-VL-2B. The demo includes the Qwen2-VL 2B base model.

🎯The Space documents content extracted from the input image as standardized plain text. It includes adjustment tools with over 30 font styles, file-format support for PDF and DOCX, text alignment, font-size adjustment, and line-spacing modifications.

📄PDFs are rendered using the ReportLab toolkit.
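
Roughly what this pipeline looks like outside the Space (a sketch, not the Space's actual code): run OCR with one of the models listed below, then render the extracted text to a PDF with ReportLab. It assumes the qwen-vl-utils helper package and a local document.png.

```python
# Sketch only: OCR an image with one of the Qwen2-VL fine-tunes listed below,
# then write the extracted text to a PDF with ReportLab.
# Assumes: pip install transformers qwen-vl-utils reportlab, plus a local "document.png".
from transformers import Qwen2VLForConditionalGeneration, AutoProcessor
from qwen_vl_utils import process_vision_info
from reportlab.lib.pagesizes import A4
from reportlab.pdfgen import canvas

model_id = "prithivMLmods/Qwen2-VL-OCR-2B-Instruct"
model = Qwen2VLForConditionalGeneration.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_id)

messages = [{
    "role": "user",
    "content": [
        {"type": "image", "image": "document.png"},
        {"type": "text", "text": "Extract the text content of this image as plain text."},
    ],
}]
prompt = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
images, videos = process_vision_info(messages)
inputs = processor(
    text=[prompt], images=images, videos=videos, padding=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=1024)
text = processor.batch_decode(
    output_ids[:, inputs.input_ids.shape[1]:], skip_special_tokens=True
)[0]

# Render the plain text to a simple single-column PDF.
pdf = canvas.Canvas("extracted.pdf", pagesize=A4)
width, height = A4
y = height - 50
for line in text.splitlines():
    if y < 50:              # page is full: start a new one
        pdf.showPage()
        y = height - 50
    pdf.drawString(50, y, line)
    y -= 14
pdf.save()
```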

🧵Models :
+ prithivMLmods/Qwen2-VL-OCR-2B-Instruct
+ prithivMLmods/Qwen2-VL-Ocrtest-2B-Instruct
+ prithivMLmods/Qwen2-VL-Math-Prase-2B-Instruct

🚀Sample Document :
+ https://drive.google.com/file/d/1Hfqqzq4Xc-3eTjbz-jcQY84V5E1YM71E/view?usp=sharing

📦Collection :
+ prithivMLmods/vision-language-models-67639f790e806e1f9799979f

.
.
.
@prithivMLmods 🤗
prithivMLmods 
posted an update 19 days ago
🎄 Here Before - Xmas🎅✨

🧑🏻‍🎄Models
+ [ Xmas 2D Illustration ] : strangerzonehf/Flux-Xmas-Illustration-LoRA
+ [ Xmas 3D Art ] : strangerzonehf/Flux-Xmas-3D-LoRA
+ [ Xmas Chocolate ] : strangerzonehf/Flux-Xmas-Chocolate-LoRA
+ [ Xmas Isometric Kit ] : strangerzonehf/Flux-Xmas-Isometric-Kit-LoRA
+ [ Xmas Realpix ] : strangerzonehf/Flux-Xmas-Realpix-LoRA
+ [ Xmas Anime ] : strangerzonehf/Flux-Anime-Xmas-LoRA

❄️Collections
+ [ Xmas Art ] : strangerzonehf/christmas-pack-6758b199487adafaddb68f82
+ [ Stranger Zone Collection ] : prithivMLmods/stranger-zone-collections-org-6737118adcf2cb40d66d0c7e

🥶Page
+ [ Stranger Zone ] : https://huggingface.co/strangerzonehf


.
.
.
@prithivMLmods 🤗
prithivMLmods 
posted an update 24 days ago
prithivMLmods 
posted an update about 1 month ago
Near 3:2 { 1280*832 } Adapters 🔥

🧪The datasets were prepared for a near-3:2 aspect ratio by processing images of any dimension (width × height) in line with each adapter's concept. Techniques such as magic expand, magic fill, or outpainting were used to fill in the remaining parts of each image to reach the 3:2 ratio used for training. This approach improved output quality (up to 2 MB for detailed prompts) and reduced artifacts in images sized 1280 × 832.

🎈This approach was used instead of cropping down to 2x or 3x zoomed regions of the original image; generative filling adjusts the image's aspect ratio proportionally within the dataset.

🔧I used Canva's Magic Expand, Firefly's Generative Fill, and Flux's Outpaint for aspect ratio adjustments.
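
For illustration, the canvas-expansion step could look like the sketch below (using Pillow; this is not the actual tooling, which was Canva, Firefly, and Flux as noted above): place the original image on the smallest near-3:2 canvas that contains it, leaving the margins to be outpainted.

```python
# Hypothetical sketch of the canvas-expansion step: pad an image of arbitrary size
# to the smallest "near 3:2" (1280:832) canvas that contains it; the blank margins
# are then filled by an outpainting / generative-fill tool.
from PIL import Image

TARGET_RATIO = 1280 / 832  # "near 3:2"

def expand_to_near_3_2(src_path: str, dst_path: str) -> None:
    img = Image.open(src_path).convert("RGB")
    w, h = img.size
    if w / h < TARGET_RATIO:                      # too narrow: widen the canvas
        new_w, new_h = round(h * TARGET_RATIO), h
    else:                                         # too wide (or exact): extend the height
        new_w, new_h = w, round(w / TARGET_RATIO)
    canvas = Image.new("RGB", (new_w, new_h), (255, 255, 255))
    canvas.paste(img, ((new_w - w) // 2, (new_h - h) // 2))  # centre the original
    canvas.save(dst_path)

expand_to_near_3_2("input.jpg", "expanded_near_3_2.jpg")
```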

⬇️Model DLC :
+ [ Microworld Nft ] : strangerzonehf/Flux-Microworld-NFT-LoRA
+ [ Creative Stocks ] : strangerzonehf/Flux-Creative-Stocks-LoRA
+ [ Icon-Kit ] : strangerzonehf/Flux-Icon-Kit-LoRA
+ [ Claymation ] : strangerzonehf/Flux-Claymation-XC-LoRA
+ [ Super Portrait ] : strangerzonehf/Flux-Super-Portrait-LoRA
+ [ Ghibli Art ] : strangerzonehf/Flux-Ghibli-Art-LoRA
+ [ Isometric Site ] : strangerzonehf/Flux-Isometric-Site-LoRA

🧨Page :
1] Stranger Zone: https://huggingface.co/strangerzonehf

💣Space :
1] Flux LoRA DLC: prithivMLmods/FLUX-LoRA-DLC

📦Collections :
1] strangerzonehf/flux-3dxl-engine-674833c14a001d5b1fdb5139
2] prithivMLmods/flux-lora-collections-66dd5908be2206cfaa8519be
3] strangerzonehf/animaker-engine-673714956dec98c400c30cf6
4] strangerzonehf/mixer-engine-673582c9c5939d8aa5bf9533

.
.
.
@prithivMLmods
prithivMLmods 
posted an update about 1 month ago
Milestone for Flux.1 Dev 🔥

💢The Flux.1 Dev model has crossed 1️⃣0️⃣,0️⃣0️⃣0️⃣ creative public adapters! 🎈
🔗 https://huggingface.co/models?other=base_model:adapter:black-forest-labs/FLUX.1-dev

💢This includes:
- 266 Finetunes
- 19 Quants
- 4 Merges

💢 Here’s the 10,000th public adapter : 😜
+ strangerzonehf/Flux-3DXL-Partfile-0006

💢 Page :
+ https://huggingface.co/strangerzonehf

💢 Collection :
+ prithivMLmods/flux-lora-collections-66dd5908be2206cfaa8519be
ZennyKenny 
posted an update about 1 month ago
prithivMLmods 
posted an update about 1 month ago
Fine-Textured [Polygon] Character 3D Design Renders 🙉

These adapters provide better lighting control (Bn+, Bn-) and richer textures than previous sets, but require more contextual prompts for optimal performance.

The best results are achieved at around 30–35 inference steps, with the ideal dimensions being 1280 x 832 [ 3:2 ]. However, the adapters also perform well at the default 1024 x 1024 [ 1:1 ].
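
A minimal diffusers sketch using the settings above, assuming FLUX.1-dev as the base model and the first adapter listed below; the prompt is illustrative, so check the adapter's model card for its trigger phrase:

```python
# Sketch only: run a 3DXL adapter on top of FLUX.1-dev with ~30-35 steps at 1280x832.
# Assumes FLUX.1-dev as the base; the prompt is illustrative, not the adapter's trigger.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)
pipe.load_lora_weights("strangerzonehf/Flux-3DXL-Partfile-0001")
pipe.to("cuda")

image = pipe(
    "fine-textured polygon character, 3D design render, studio lighting",
    width=1280,
    height=832,
    num_inference_steps=32,
    guidance_scale=3.5,
).images[0]
image.save("3dxl_character.png")
```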

💢Models DLC :
+ strangerzonehf/Flux-3DXL-Partfile-0001
+ strangerzonehf/Flux-3DXL-Partfile-0002
+ strangerzonehf/Flux-3DXL-Partfile-0003
+ strangerzonehf/Flux-3DXL-Partfile-0004
+ strangerzonehf/Flux-3DXL-Partfile-C0001

💢Collections :
1] strangerzonehf/flux-3dxl-engine-674833c14a001d5b1fdb5139
2] prithivMLmods/flux-lora-collections-66dd5908be2206cfaa8519be

💢Space :
1] prithivMLmods/FLUX-LoRA-DLC

💢Page :
1] Stranger Zone: https://huggingface.co/strangerzonehf

.
.
.
@prithivMLmods 🤗
prithivMLmods 
posted an update about 1 month ago
HF Posts Receipts 🏆🚀

[ HF POSTS RECEIPT ] : prithivMLmods/HF-POSTS-RECEIPT

🥠The only thing you need to remember is your 'username'.

🥠And yeah, thank you, @maxiw, for creating the awesome dataset and sharing it here! 🙌

🥠[ Dataset ] : maxiw/hf-posts

.
.
.
@prithivMLmods
ZennyKenny 
posted an update about 1 month ago
I've joined the Bluesky community. Interested to see what decentralized social media looks like in action: https://bsky.app/profile/kghamilton.bsky.social

Looking forward to following other AI builders, tech enthusiasts, goth doomscrollers, and ironic meme creators.