• 24 Posts
  • 653 Comments
Joined 1 year ago
Cake day: July 1st, 2023


  • Smokeydope@lemmy.world to linuxmemes@lemmy.world: It do be like that

    Photoshop CS2 is free to download and probably works well on Wine since it's old as hell.

    Edit: Correction. You used to be able to download CS2 from Adobe's website. This is no longer the case.

    GIMP has always been able to do what I needed, more or less. It's got a learning curve and sometimes I still don't 100% understand how something works, but for basic photo editing, meme making, and converting photos to different file types (why is .webp not universally supported yet?) it's pretty good.
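
    If I ever want to script that conversion step instead of clicking through GIMP, a little Python sketch with Pillow does the trick (a minimal sketch; the folder name and quality value are just placeholders I picked):

    ```python
    # Batch-convert images to .webp with Pillow (pip install pillow).
    from pathlib import Path
    from PIL import Image

    for src in Path("memes").glob("*.png"):      # "memes" is a placeholder folder
        img = Image.open(src).convert("RGB")     # flatten any alpha channel to plain RGB
        img.save(src.with_suffix(".webp"), "WEBP", quality=85)
    ```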

    Whenever there was a problem doing something in GIMP, it always felt like the issue was my own understanding of the toolset combined with not-great documentation. I never felt like I hit the limits of the program itself.

    I'm sure if you are a professional graphics person who needs advanced tools, there are things only PS can provide, and its user interface is probably friendlier. But for me, the average Joe Schmo, GIMP gets the job done 95% of the time with little to no headache.



  • Yeah, I know better than to get involved in debating someone who is more interested in spitting out five-paragraph essays trying to deconstruct and invalidate others' views one by one than in double-checking whether they're still talking to the same person.

    I believe you aren't interested in exchanging ideas and different viewpoints. You want to win an argument and validate that your view is the right one. Sorry, I'm not the kind of person who enjoys arguing back and forth over the internet, or in general. Look elsewhere for a debate opponent to sharpen your rhetoric on.

    I wish you well in life, whoever you are, but there is no point in us talking. We will just have to see how the future goes over the next 10 years.




  • A tool is a tool. It has no say in how it's used. AI is no different from the computer software you use to browse the internet or do other digital tasks.

    When it's used badly, as an outlet for escapism or a substitute for social connection, it can lead to bad consequences in your personal life.

    It's at its best as a tool to help reason through a tough task or as a step in a creative process; as on-demand assistance to aid the disabled; as a non-judgemental conversational partner that the neurodivergent and emotionally traumatized can open up to; or as a way to help a super genius rubber-duck their novel ideas and work through complex thought processes. It can improve people's lives for the better if applied to the right use cases.

    It's about how you choose to interact with it in your personal life, and how society, businesses, and your governing bodies choose to use it in their own processes. And believe me, they will find ways to use it.

    I think comparing LLMs to computers in the 90s is accurate. Right now only nerds, professionals, and industry/business/military see their potential. As the tech gets figured out, utility improves, and LLM desktops start getting sold as consumer-grade appliances, maybe the attitude will change?


  • It delivers on what it promises for many people who use LLMs. They can be used for coding assistance, setting up automated customer support, tutoring, processing documents, structuring lots of complex information, generally accurate knowledge on many topics, acting as an editor for your writing, and lots more.

    It's a rapidly advancing pioneer technology, like computers were in the 90s, so every 6 months to a year brings a new breakthrough in overall intelligence or a new ability. The newest LLM models can process images or audio as well as text.

    The problem for OpenAI is they have serious competitors who will absolutely show up to eat their lunch if they sink as a company: Facebook/Meta with their Llama models, Mistral AI with all their models, Alibaba with Qwen, plus some good smaller competition like the OpenHermes team. All of these big tech companies have open-sourced some models so you can tinker and finetune them at home, while OpenAI remains closed source, which is ironic given the company name… Most of these AI companies offer cloud access to their models at very competitive pricing, especially Mistral.

    The people who say AI is a trendy useless fad don't know what they are talking about or are just upset at AI. I am part of the local LLM community and have been playing around with open models for months, pushing my computer's hardware to its limits. It's very cool seeing just how smart they really are, and what a computer that simulates human thought processes and knows a little bit of everything can actually do to help me in daily life.

    Terence Tao, superstar genius mathematician, describes the newest high-end model from OpenAI as improving from an "incompetent graduate" to a "mediocre graduate", which essentially means AI is now generally smarter than the average person in many regards.

    This month several competitor LLMs released which, while much smaller than OpenAI's o1, somehow beat or equaled that big OpenAI model in many benchmarks.

    Neural networks are here and they are only going to get better. We're in for a wild ride.



  • "Weed lab"! You make the procedure of baking a ground-up plant in the oven for 30 minutes, then putting it in a crock pot with coconut oil/butter, sound like a Breaking Bad meth cooking operation. Jesse, we need to cook some brownies :)

    I respect that processing hemp flower at home isn't your thing. WNC CBD has always been top tier with its THCA flower, and the edibles from them will almost certainly kick ass. They know what they're doing.

    I suggested homemade edibles or tinctures as an economical and effective option. Usually edible users look to pot for frequent pain relief or stress medicine, and buying premade stuff that's actually effective gets pricey quick for medical users.

    Here in the USA you can buy legal THCA or CBD hemp flower shake right from wholesalers online, dirt cheap. There are also dedicated cooking appliances like the Magic Butter maker and the Nova FX that automate the whole process.

    In case you give it a second thought: the pot smell released during the oven-baking step can be mitigated by sealing the flower in a mason jar while it cooks. The jar can easily withstand the 240°F temperature you decarb the herb at, and this also helps recapture active terpenes and cannabinoids that vaporize at low temps.

    Good luck, hope you find some awesome stuff.



  • It's not just AI code but AI stuff in general.

    It boils down to Lemmy having a disproportionate number of leftist liberal-arts college student types. That's just the reality of this platform.

    Those types tend to see AI as a threat to their independent creative businesses, as well as feeling slighted that their data may have been used to train a model.

    It's understandable why lots of people denounce AI out of fear, spite, or ignorance. It's hard to remain fair and open to a new technology when it's threatening your livelihood and its early foundations may have scraped your data non-consensually for training.

    So you'll see an AI hate circlejerk post every couple of days from angry people who want to poison models and cheer for the idea that it's just trendy nonsense. Don't debate them. Don't argue. Just let them vent and move on with your day.


  • Thanks for sharing. I knew him from some Numberphile vids; cool to see he has a Mastodon account. Good to know that LLMs are crawling from "incompetent graduate" to "mediocre graduate", which basically means they're already smarter than most people at many kinds of reasoning tasks.

    I'm not a big fan of the way the guy speaks, though. As is common for super-intelligent academic types, he uses overly complicated wording to formally describe even the most basic opinions while mixing in hints of inflated ego and intellectual superiority. He should start experimenting with having o1 as his editor to summarize his toots.


  • Hey @brucethemoose, hope you don't mind if I ding you one more time. Today I loaded up Qwen 14B and 32B. Yes, 32B (Q3_K_S). I didn't do much testing with the 14B, but it spoke well and fast. Honestly, I was more excited to play with the 32B once I found out it would run. It just barely makes the mark of tolerable speed at just under 2 T/s (really more like 1.7 with some context loaded in). I really do mean barely; the people who think 5 T/s is slow would eat their hearts out. But that reasoning and coherence? Off the charts.

    I like the way it speaks more than Mistral Small too, so wow, just wow, is all I can say. Can't believe all the good models that came out in such a short time and the leaps made in the past two months. Thank you again for recommending Qwen; I don't think I would have tried the 32B without your input.
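
    For reference, this is roughly how I eyeball tokens per second on one of these quantized GGUFs. A minimal sketch assuming llama-cpp-python built with GPU offload; the model filename and layer count are placeholders, not the exact files I ran:

    ```python
    # Rough tokens/sec check on a quantized GGUF (pip install llama-cpp-python,
    # built with GPU offload support). Filename and n_gpu_layers are placeholders.
    import time
    from llama_cpp import Llama

    llm = Llama(
        model_path="qwen-32b-q3_k_s.gguf",  # hypothetical filename
        n_gpu_layers=20,                    # raise until VRAM runs out
        n_ctx=4096,
    )

    start = time.perf_counter()
    out = llm("Explain rubber duck debugging in two sentences.", max_tokens=128)
    elapsed = time.perf_counter() - start

    tokens = out["usage"]["completion_tokens"]
    print(f"{tokens} tokens in {elapsed:.1f}s -> {tokens / elapsed:.2f} T/s")
    ```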





  • Thanks for the recommendation. Today I tried out Mistral Small IQ4_XS, combined with running kobold through a headless terminal environment to squeeze out that last bit of VRAM. With that, I was able to bump the offloaded GPU layers from 28 to 34, and token speed went up from 2.7 T/s to 3.7 T/s, which is roughly a 37% speed increase. I imagine going to Q3 would make things even faster or allow for a bump in context size.
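
    For anyone checking my math on that speedup, the quick arithmetic:

    ```python
    # Throughput gain from offloading 28 -> 34 GPU layers.
    before, after = 2.7, 3.7                            # tokens/sec before and after
    print(f"{(after - before) / before:.0%} faster")    # -> 37% faster
    ```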

    I appreciate you recommending Qwen too, I'll look into it.