This would be nice. Today, in my company, we handle about 30,000 tickets per month, and when we need to generate reports, MySQL performance is a bottleneck: we have to run many of the queries overnight to avoid impacting operations. Another technique we use is to pre-calculate and pre-process some data in real time using triggers.
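As a rough sketch of what that trigger-based precomputation can look like, here is a self-contained example (using SQLite so it runs anywhere; the `ticket`/`ticket_stats` schema is made up for illustration) that keeps a per-month ticket counter up to date on every insert, so reports read one tiny table instead of scanning the whole ticket table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ticket (id INTEGER PRIMARY KEY, opened TEXT);

-- Small pre-aggregated table that reports can query directly.
CREATE TABLE ticket_stats (month TEXT PRIMARY KEY, n INTEGER NOT NULL);

-- Maintain ticket_stats in real time as tickets are inserted.
CREATE TRIGGER ticket_count AFTER INSERT ON ticket
BEGIN
    INSERT OR IGNORE INTO ticket_stats (month, n)
        VALUES (strftime('%Y-%m', NEW.opened), 0);
    UPDATE ticket_stats SET n = n + 1
        WHERE month = strftime('%Y-%m', NEW.opened);
END;
""")

conn.executemany("INSERT INTO ticket (opened) VALUES (?)",
                 [("2024-01-05",), ("2024-01-20",), ("2024-02-03",)])

# The reporting query now touches only ticket_stats, not the ticket table.
print(dict(conn.execute("SELECT month, n FROM ticket_stats")))
# {'2024-01': 2, '2024-02': 1}
```

The same idea carries over to MySQL or PostgreSQL triggers; only the trigger syntax differs.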
Using PostgreSQL would help a lot.
Discussion about PostgreSQL & SQLite support here: https://github.com/glpi-project/glpi/issues/942#issuecomment-250897440
That discussion is not really about PostgreSQL as such, but more about using a standardised tool which can talk to any database; because such a tool is intended to talk to simple databases, it cripples the available queries by disallowing JOINs and subqueries.
I really don't get why people are afraid of pgSQL. MySQL is OK for small jobs, but it has a limited range of query features and data types, and its clustering is badly broken in ways that make it look fine until it falls over unexpectedly (as our devs found out the hard way).
OK, I do get why people are afraid. Yes, it uses a bit more memory when you install it, but the days of 64 MB machines are long past. Having used MySQL for 20 years and then learned pgSQL, I'd say the differences are minor. pgSQL is stricter about some queries, but our experience has been "you should have written them better in the first place"; once fixed, they ran faster on the MySQL boxes too.
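To illustrate the "stricter about some queries" point: the classic case is a GROUP BY that selects a column which is neither grouped nor aggregated. A small sketch (SQLite stands in for pre-5.7 MySQL here, since both accept the sloppy form; the table and data are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ticket (assignee TEXT, status TEXT);
INSERT INTO ticket VALUES ('ana', 'open'), ('ana', 'closed'), ('bob', 'open');
""")

# Sloppy form: 'status' is neither grouped nor aggregated. SQLite and
# pre-5.7 MySQL silently return an arbitrary row's value; PostgreSQL
# (and MySQL 5.7+ with ONLY_FULL_GROUP_BY) rejects the query outright.
sloppy = "SELECT assignee, status, COUNT(*) FROM ticket GROUP BY assignee"

# Portable rewrite: every selected column is grouped or aggregated, so
# the result is deterministic and the query runs on both engines.
strict = """SELECT assignee, COUNT(*) AS n
            FROM ticket GROUP BY assignee ORDER BY assignee"""

print(conn.execute(strict).fetchall())
# [('ana', 2), ('bob', 1)]
```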
When I converted my Bacula database to pgSQL (50 million entries), it dropped RAM requirements by over 60%: from 40 GB down to about 7 GB. Queries run 10 times faster and database dumps run 3 times faster.
MySQL simply doesn't scale. It's good for small, simple databases but GLPI isn't small or simple.
Everything is about contribution. We need more time & people to work on the subject.
Yes we know that PostgreSQL is good, and yes we know that we want to support it.
We just don't have the time & money right now for such a big change.
If you're willing to help, there's a lot of different ways to do it. Feel free to join us on the mailing list, IRC or Telegram to discuss that.
Adding PostgreSQL support is a very, very big job... but we've already made some changes toward it, like:
- dropping all raw SQL queries (not yet finished, but a lot of the work has already been done),
- switching to PHP PDO (possible only in a major release, since it would break many things), etc.

I've posted threads about that on the dev mailing list.
See https://github.com/glpi-project/glpi/pull/5968 (this is still a WIP, and there is no planned date).
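To make the "drop raw SQL, go through one layer" idea concrete, here is a minimal sketch of the pattern in Python (illustrative only, not GLPI's actual API): all queries funnel through one adapter, so swapping the backend, e.g. from a MySQL driver to PDO or to PostgreSQL, means changing one class instead of hunting down raw queries across the codebase:

```python
import sqlite3

class Database:
    """Single choke point for SQL: application code never writes raw,
    dialect-specific queries, so the backend can be swapped later."""

    def __init__(self, conn, placeholder="?"):
        self.conn = conn
        # Placeholder style varies by driver ('?' vs '%s'), so it is a
        # property of the adapter, not of the calling code.
        self.placeholder = placeholder

    def select(self, table, columns, where):
        # NOTE: real code must validate/whitelist identifiers (table and
        # column names); only the values go through bound placeholders.
        cond = " AND ".join(f"{c} = {self.placeholder}" for c in where)
        sql = f"SELECT {', '.join(columns)} FROM {table} WHERE {cond}"
        return self.conn.execute(sql, tuple(where.values())).fetchall()

db = Database(sqlite3.connect(":memory:"))
db.conn.executescript("""
CREATE TABLE ticket (id INTEGER, status TEXT);
INSERT INTO ticket VALUES (1, 'open'), (2, 'closed');
""")
print(db.select("ticket", ["id"], {"status": "open"}))
# [(1,)]
```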
I realize this is no small feature request, so I would like to outline why running MySQL alongside, for example, an existing PostgreSQL installation is no small feat either.
For a DB installation you need things like:
- a high-availability setup,
- monitoring and alerting (with useful alerts, e.g. replication lag exceeding thresholds),
- a disaster recovery plan, i.e. backups (preferably full backups combined with streaming the transaction logs to minimize data loss); you need to encrypt these backups for compliance, and have a plan for backing up the encryption keys,
- monitoring of that backup solution itself,
- regular testing of the disaster recovery plan,

and I'm probably leaving out a whole bunch of other things you also need.
Speaking from personal experience, I have exactly such a setup for PostgreSQL. Configuring and maintaining an equivalent stack for MySQL just for GLPI is simply not feasible.
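For the "full backups plus streamed logs, encrypted" part, the PostgreSQL side can be sketched with a few configuration lines (a minimal illustration; the GPG recipient and paths are placeholders, and periodic base backups, e.g. with pg_basebackup, are still needed alongside the log archiving):

```
# postgresql.conf -- continuous WAL archiving on top of periodic base backups
wal_level = replica
archive_mode = on
# Encrypt each finished WAL segment before it leaves the host
# ('backup@example.org' is a placeholder GPG recipient).
archive_command = 'gpg --encrypt --recipient backup@example.org --output /backup/wal/%f.gpg %p'
```

Every item in the list above (alerting, key management, restore testing) then has to be built around a fragment like this, which is why duplicating the whole stack for a second engine is so expensive.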
From a technical perspective, I would say both MySQL and PostgreSQL support enough of the more recent ANSI SQL standards that all the required SQL functionality should be there to stay compatible between the two. I also realize there will always be some corner cases which require engine-specific queries for MySQL or PostgreSQL, but I think 99% of the queries could be compatible between the two.
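One pattern for the remaining 1% is to isolate the dialect-specific fragments behind a tiny helper; a classic divergence is string aggregation, which the two engines spell differently. A hedged sketch (the helper name is made up for illustration):

```python
def string_agg_sql(column: str, sep: str, dialect: str) -> str:
    """Return the dialect-specific SQL fragment for concatenating a
    grouped column -- one of the few places where MySQL and PostgreSQL
    genuinely diverge."""
    if dialect == "mysql":
        return f"GROUP_CONCAT({column} SEPARATOR '{sep}')"
    if dialect == "postgresql":
        return f"string_agg({column}, '{sep}')"
    raise ValueError(f"unsupported dialect: {dialect}")

print(string_agg_sql("name", ", ", "postgresql"))
# string_agg(name, ', ')
```

The 99% of portable queries never touch such helpers; only the handful of genuinely divergent constructs get routed through them.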