Commit messages

As this query can take some time.

And include the chunk size in the log message.

This is so that data.qa.guix.gnu.org can be configured only to query the
branches from the main repository.

To better understand the memory usage when this is happening.

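A minimal sketch of what logging memory usage at a point like this can look
like in Guile, using the built-in (gc-stats) statistics; the label and the
call site are illustrative rather than the actual guix-data-service code:

    (use-modules (ice-9 format))

    (define (log-heap-size label)
      ;; (gc-stats) returns an association list of GC statistics;
      ;; heap-size is the current heap size in bytes.
      (format (current-error-port)
              "~a: heap size ~a MiB~%"
              label
              (round (/ (assq-ref (gc-stats) 'heap-size)
                        (* 1024 1024)))))

    (log-heap-size "before inserting package derivations")
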
Since larger chunks still ran into inferior memory usage problems.

In an attempt to reduce the peak memory usage, and avoid running into the
"Too many heap sections: Increase MAXHINCR or MAX_HEAP_SECTS" issue.

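The pattern being tuned in these commits, sketched with assumed names (the
real code and chunk sizes differ): process the list in fixed-size chunks so
that only one chunk's worth of data is live at a time.

    (use-modules (srfi srfi-1) (srfi srfi-11))

    (define (for-each-chunk proc lst chunk-size)
      "Call PROC on successive sublists of LST of at most CHUNK-SIZE elements."
      (unless (null? lst)
        (let-values (((chunk rest)
                      (split-at lst (min chunk-size (length lst)))))
          (proc chunk)
          (for-each-chunk proc rest chunk-size))))

    ;; Example: three calls, with chunks of 10, 10 and 5 elements.
    (for-each-chunk (lambda (chunk) (pk 'chunk-length (length chunk)))
                    (iota 25)
                    10)
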
As this is a little clearer.

Drop the batch size to get rid of warnings about memory usage and improve the
logging by adding duration information.

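A small sketch of adding duration information to a log message using Guile's
internal real-time clock; the label and the wrapped work are stand-ins, not
the actual change.

    (define (call-with-duration-logging label thunk)
      (let* ((start (get-internal-real-time))
             (result (thunk))
             (seconds (exact->inexact
                       (/ (- (get-internal-real-time) start)
                          internal-time-units-per-second))))
        (simple-format (current-error-port)
                       "~A took ~A seconds\n" label seconds)
        result))

    ;; usleep here stands in for inserting a batch of data.
    (call-with-duration-logging "inserting batch"
                                (lambda () (usleep 250000)))
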
The build event information can now contain the derivation outputs, as well as
the name of the derivation. This allows the Guix Data Service to join builds
up with derivations, even if it doesn't know about the derivation being built.

To ensure that direct array comparison can be used in the query.

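One way to read this, illustrated with a made-up helper rather than the
actual code or schema: if the stored arrays are kept in a canonical (sorted)
order, a query can compare them directly with = instead of unnesting them and
comparing them as sets.

    (define (ids->postgresql-array ids)
      ;; Sorting the ids means two arrays with the same elements always
      ;; produce the same literal, so a direct = comparison is valid.
      (string-append
       "{" (string-join (map number->string (sort ids <)) ",") "}"))

    (ids->postgresql-array '(3 1 2))   ; => "{1,2,3}"
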
So that this can be used when inserting builds.

So that this can be done when inserting builds.

The server part of the guix-data-service doesn't work great as a guix service,
since it often fails to start if the migrations take any time at all.
To address this, start the server before running the migrations, and serve the
pages that work without the database, plus a general 503 response. Once the
migrations have completed, switch to the normal behaviour.

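A minimal sketch of that arrangement using Guile's (web server) module; the
handler, port and run-migrations! procedure are placeholders rather than the
actual guix-data-service code.

    (use-modules (web server)
                 (web response)
                 (ice-9 atomic)
                 (ice-9 threads))

    (define migrations-finished? (make-atomic-box #f))

    (define (run-migrations!)
      ;; Stand-in for running the real database migrations.
      (sleep 5))

    (define (handler request body)
      (if (atomic-box-ref migrations-finished?)
          ;; Normal behaviour once the database is ready.
          (values '((content-type . (text/plain)))
                  "database ready\n")
          ;; Until then, answer with a general 503 response.
          (values (build-response
                   #:code 503
                   #:headers '((content-type . (text/plain))))
                  "migrations in progress, please retry shortly\n")))

    ;; Start the migrations in the background, then serve requests
    ;; immediately; the handler switches behaviour once they finish.
    (call-with-new-thread
     (lambda ()
       (run-migrations!)
       (atomic-box-set! migrations-finished? #t)))

    (run-server handler 'http '(#:port 8080))
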
And also remove the duplicates that have crept in.

This helps render the package version range related pages.

I think it was broken with the git_branches/git_commits switch.

This matches the previous behaviour without using the platform data.

This means there's less reliance on the hardcoded lists of systems and targets
and mappings between them.

For the table schema change.

This means you can query for derivations where builds exist or don't exist on
a given build server.
I think this will come in useful when submitting builds from a Guix Data
Service instance.

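Roughly the kind of query this enables; the table and column names below are
illustrative assumptions, not the actual Guix Data Service schema. Selecting
derivations with no build on a particular build server uses NOT EXISTS, and
dropping the NOT gives the "builds exist" variant.

    (define derivations-without-builds-query
      "
    SELECT derivations.id
    FROM derivations
    WHERE NOT EXISTS (
      SELECT 1
      FROM builds
      WHERE builds.derivation_id = derivations.id
        AND builds.build_server_id = $1
    )")
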
And create a proper git_branches table in the process.
I'm hoping this will help with slow deletions from the
package_derivations_by_guix_revision_range table in the case where there are
lots of branches, since it'll separate the data for one branch from another.
These migrations will remove the existing data, so
rebuild-package-derivations-table will currently need to be run manually to
regenerate it.

Thanks to Tobias for reporting.

As I'm seeing the inferior process crash with [1] just after fetching the
derivation lint warnings.
This change appears to help, although it's probably just a workaround. When
there are more packages/derivations, the caches might need clearing while
fetching the derivation lint warnings, or this will need to be split across
multiple processes.
[1]: Too many heap sections: Increase MAXHINCR or MAX_HEAP_SECTS

These cached store connections have caches associated with them that take up
lots of memory, leading to the inferior crashing. This change seems to help.

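The general shape of that kind of fix, sketched with (guix store); the real
change is in the inferior-handling code and may well look different.

    (use-modules (guix store))

    (define (call-with-temporary-store-connection proc)
      ;; with-store opens a fresh connection and closes it when the dynamic
      ;; extent of the body is left, so neither the connection nor its
      ;; caches outlive the work done by PROC.
      (with-store store
        (proc store)))
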
To the end of the main revision processing transaction.
Currently, I think there are issues when this query does update some builds,
as those rows in the build table remain locked until the end of the
transaction. This then causes build event submission to hang. Moving this part
of the revision loading process to the end of the transaction should help to
mitigate this.

When there's a target, render the heading neatly, and include the target
parameter in the URLs.

This means that the lock can be acquired after closing the inferior, freeing
the large amount of memory that the inferior process is probably using.

Since the hardcoded list in the load-new-guix-revision code has been updated.

As the cross targets take quite some time.

This might help reduce memory usage a little.

Previously, duplicates could creep through if the duplicate wasn't exported
and was only found as a replacement. Now they're filtered out.
This isn't ideal, as duplicates aren't always mistakes and it would still be
useful to capture such packages, but having multiple entries for the same
name+version causes the comparison functionality to break.

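The filtering technique in miniature, using name/version pairs instead of the
package records the real code works with: SRFI-1's delete-duplicates with an
equality predicate that only compares name and version keeps the first entry
and drops the rest.

    (use-modules (srfi srfi-1))

    (define (same-name+version? a b)
      (and (string=? (first a) (first b))      ; name
           (string=? (second a) (second b))))  ; version

    (delete-duplicates
     '(("hello" "2.12") ("guile" "3.0.9") ("hello" "2.12"))
     same-name+version?)
    ;; => (("hello" "2.12") ("guile" "3.0.9"))
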
Use the a-version and b-version variables, rather than calling the functions
again.