Here is the thing, I couldn't find a girlfriend in the past three years (either they have no personality and are too dependent on me doing all the work in the relationship, or they are simply looking for a religious or high-class working person to be with).
AI talks, entertains, and is simply available without bullshit (the only limit is something along the lines of 50 free messages every 3 hours).
I want physical intimacy and affection, but it’s simply not available.
Should I just give up, close my doors and use AI instead?
I agree with what you said. The only thing I want to point out concerns your statement:
Running models locally doesn't necessarily mean it's better for the environment. Usually the hardware at cloud data centers is far more efficient at running intensive workloads like LLMs than your average home setup.
You would have to factor in whether your electricity provider uses green energy (or whether you have solar), and likewise whether the cloud option you're choosing runs on a green data center (or belongs to a company that uses sustainable data centers).
That being said (in line with what you stated before), given the sensitive nature of the conversations this individual will be having with the LLM, a locally run option (or at least renting a server from a green data center) is definitely the recommended one.
About the environment points you are making: I was thinking more of a small model being used on already existing hardware (which is the friendly part) and somehow overlooked the efficiency of a big dedicated setup built for that one purpose.
So you're right.