On 19/03/2020 20:13, Martin Gregorie wrote:
>
>> Same as SQL. By the time you have taken a day to write the SQL query
>> that does everything you want, only to realise it takes 50 minutes to
>> complete, you could have written most of it in C and got it down to 3
>> seconds...
>>
> Here I DO disagree.
>
> SQL is fine unless you insist on writing huge, do-everything queries. I
> was involved with one of them years ago as part of a benchmarking
> exercise and that pretty much put me off writing that sort of thing for
> good. Never tried SQL procedures either, but concise SQL queries used
> judiciously within logic written in C or Java work very well and are easy
> enough to write and maintain.
>
Exactly what I said. Don't use SQL to do complex stuff - it's very hard to
get the syntax right and it runs like a dog with three legs amputated.
> Most SQL performance problems, IME anyway, boil down to crap database
> design, meaning bad or nonexistent normalisation and incorrectly placed
> or missing indexes.
No. In the case where I did the biggest job - it was normalising a flat
database of a few million UK postcodes into a relational one - none of
these were the problem.
What was the problem was MySQL's inability to create good optimised
query plans from SQL statements. Unlike - say - modern C compilers,
which astound me in their ability to write better assembler than I could
myself, MySQL is like going back to the first 8-bit C compilers I used.
> But, given that a relational database has a decent,
> user-friendly query analyser and there's enough realistic test data it's
> generally quite simple to get the speed up to where it should be.
>
On simple queries, yes, but not on complex ones involving conditional
selections of selections etc.
> Of course, if the DBA can't normalise and doesn't understand an ERD,
> and if the system designers only provide small amounts of largely
> imagined data rather than a few hundred or thousand actual business data
> items, then OF COURSE the database performance will be crap.
>
> Don't ask me how I know that: I've been called on too many times to fix
> that sort of mess. But sometimes the clients got it right. In one project
> it was very nice indeed to be given half a million records of valid test
> data. That was for the last major DB I worked on: I did much of the
> design and then tuned it using that huge pile of actual data. It really
> sang from the off.
>
When I had finished, what I wanted ran well, with over a million records,
but it did not use complex queries.
Creating it from the data I started with meant - once I had given up
trying to do the whole job with SQL and restricted myself to simple
queries - building enormous linked lists in C, over a gigabyte in size,
and thinking hard about how I would access the contents.
--
“People believe certain stories because everyone important tells them,
and people tell those stories because everyone important believes them.
Indeed, when a conventional wisdom is at its fullest strength, one’s
agreement with that conventional wisdom becomes almost a litmus test of
one’s suitability to be taken seriously.”
Paul Krugman
--- SoupGate-Win32 v1.05
* Origin: Agency HUB, Dunedin - New Zealand | FidoUsenet Gateway (3:770/3)