MP> I am having a discussion in DoveNet TechTalk where the other party
MP> brought up how AI has sort of been used for years in their field. It
MP> provided various mathematical and data point information that was
MP> accurate. It is when they try to get AI to do things that require it to
MP> "think like a human," rather than a computer, that they get into
MP> trouble.
Into trouble with electric consumption? (Or into trouble with the programming?)
MP> >I understand that data centers can consume a lot, but I'm still in the dark
MP> >about how AI demands more electricity than normal webservers.
MP>
MP> Webservers consume electricity for sure, but they don't do a lot of
MP> processor-intensive work. The processors in an AI datacenter are going to
MP> be different -- think more "heavy duty" -- and the machines are going to
MP> pull a lot more power than a machine that serves up web pages.
I appreciate your explanation. I still don't understand what constitutes "AI," though, or why an AI system would need (for example) more cores, more RAM, or more bandwidth than an average webserver.
The phrase "AI" is ambiguous. I did a Google search of "Which language is AI written in" and it said "Python," but is that accurate? It seems like there are better programming languages to use for tasks like emulating human thoughts.
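For what it's worth, here's a rough back-of-the-envelope sketch (with made-up model sizes, not any real system's numbers) of why AI workloads chew through so much more processor time than serving web pages: modern "AI" models are mostly enormous matrix multiplications, and Python is usually just the glue around C/CUDA libraries that do that arithmetic.

```python
# Hypothetical illustration: count the multiply-add operations in a
# neural-network layer. A matrix product of an (m x k) matrix with a
# (k x n) matrix costs roughly 2*m*n*k floating-point operations
# (one multiply plus one add per inner-loop step).

def matmul_flops(m, n, k):
    """Approximate floating-point operations for an (m x k) @ (k x n) product."""
    return 2 * m * n * k

# Made-up model: 32 layers, each with a 4096x4096 weight matrix,
# processing one token of input at a time.
per_layer = matmul_flops(1, 4096, 4096)   # ~33.5 million operations
total = per_layer * 32                    # ~1.07 billion operations per token
print(per_layer, total)
```

By contrast, serving a static web page is mostly copying bytes from disk or cache to a network socket -- very little arithmetic at all. That's why AI datacenters get fitted with GPUs and other "heavy duty" processors that draw far more power per machine.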
MP> For industrial plants, the power companies are probably taking the extra
MP> draw into account. For a "data center," they may not realize how much
MP> power the owner is actually planning to consume so they may treat it like
MP> any other office facility.
I get what you're saying, and it's a good guess, but to get to the bottom of this I'll have to (someday) learn why AI programs are more resource-intensive than others.
--- Mystic BBS v1.12 A48 (Linux/64)
* Origin: JoesBBS.com, Telnet:23 SSH:22 HTTP:80 (1:342/202)