SBT Snapshot Resolution

I'm used to Maven, where SNAPSHOT resolution is cached and only invalidated every X amount of time (or forcibly with -U). With SBT, it resolves my snapshots every time, which is painfully slow. Is there a way to tell SBT not to perform snapshot resolution on every compile?

See the Cached Resolution feature of sbt 0.13.7 and newer for more detail.
In your *.sbt project definition file, add:
updateOptions := updateOptions.value.withCachedResolution(true)
Edit:
As per #samuel's comment below, the above setting does not (yet) affect snapshot dependencies, as they are marked as "changing" and always updated anyway. A workaround is to use sbt in offline mode after having resolved and downloaded all dependencies.
For example, in the sbt shell set offline := true.
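Putting the two together, the workflow looks roughly like this (a sketch, run from the project directory; the set command only applies to that one invocation):

# resolve and download everything once, while online
sbt update

# subsequent builds: skip snapshot re-resolution for this invocation only
sbt "set offline := true" compile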

Related

Github Action failing with R CMD check, using old commit?

I'm not sure how best to describe this, hence the rather vague title.
I have an R package that uses Github Actions to run checks. You can see the workflow file here:
https://github.com/Azure/Microsoft365R/blob/master/.github/workflows/check-standard.yaml
It's basically the same as the check-standard workflow in the r-lib/actions repo, with some tweaks for my particular requirements. My latest commit is failing the check for the MacOS build, with this error:
Run remotes::install_deps(dependencies = TRUE)
Error: Error: HTTP error 404.
Not Found
Did you spell the repo owner (`hongooi73`) and repo name (`AzureGraph`) correctly?
- If spelling is correct, check that you have the required permissions to access the repo.
Execution halted
Error: Process completed with exit code 1.
The step in question is this. It just scans the package's DESCRIPTION file and installs the dependencies for the package -- all very straightforward.
- name: Install dependencies
  run: |
    remotes::install_deps(dependencies = TRUE)
    remotes::install_cran(c("pkgbuild", "rcmdcheck", "drat"))
  shell: Rscript {0}
It looks like it's trying to install a dependency from the hongooi73/AzureGraph repo, which no longer exists. But my DESCRIPTION file doesn't list hongooi73/AzureGraph as a remote dependency; it uses Azure/AzureGraph, of which hongooi73/AzureGraph was a fork. It used to refer to hongooi73/AzureGraph, but that was several commits ago. Indeed, the Linux and Windows checks both run without problems, so they are clearly using the correct repo location.
What can be causing this failure? And how do I fix it? I've already tried rerunning the workflow, and deleting older workflows.
You're using actions/cache to cache your R libs. This means you may be restoring a cache that is no longer valid if your key and restore-keys aren't set up properly.
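A common workaround is to change the cache key so the stale entry is simply never restored. The step below is only an illustration modeled on the check-standard workflow, not your exact file; the "v2" segment is the part you would bump:

- name: Cache R packages
  uses: actions/cache@v2
  with:
    path: ${{ env.R_LIBS_USER }}
    # bumping "v2" to "v3" abandons the old cache and rebuilds the library from scratch
    key: ${{ runner.os }}-r-v2-${{ hashFiles('.github/depends.Rds') }}
    restore-keys: ${{ runner.os }}-r-v2-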
At the moment, there is no direct way to manually clear the cache. For some other options you can check Clear cache in GitHub Actions.
Jan. 2021: "At the moment, there is no direct way to manually clear the cache."
June 2022: Actually, there now is:
List and delete caches in your Actions workflows
You can now get more transparency and control over dependency caching in your actions workflows.
Actions users who use actions/cache to make jobs faster on GitHub Actions can now use our cache list and delete APIs to:
- list all the Actions caches within a repository and sort by specific metadata like cache size, creation time, or last accessed time.
- delete a corrupt or stale cache entry by providing the cache key or ID.
Learn more about Managing caching dependencies to speed up workflows.
See the updated answer to "Clear cache in GitHub Actions" from beatngu13 for the GitHub API call examples.
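For reference, the two calls look roughly like this with curl (a sketch; OWNER, REPO, TOKEN, and CACHE_KEY are placeholders):

# list the Actions caches in a repository
curl -H "Accept: application/vnd.github+json" \
     -H "Authorization: Bearer TOKEN" \
     https://api.github.com/repos/OWNER/REPO/actions/caches

# delete every cache entry matching a given key
curl -X DELETE \
     -H "Accept: application/vnd.github+json" \
     -H "Authorization: Bearer TOKEN" \
     "https://api.github.com/repos/OWNER/REPO/actions/caches?key=CACHE_KEY"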

Can Duplicati preserve file dates and times?

This is a continuation of: Duplicati and Backup of live Pervasive Database Missing Data
We have attempted to restore a database data directory. What we are expecting to see in the backup is an exact mirror of what was backed up the night prior.
We are still not seeing all of the data in the restore directory. As far as we can tell, Duplicati seems to be using the modified date and/or file size of each file when determining which files to back up. Can someone please confirm this one way or the other?
Is there a way to have Duplicati take a backup of only those files whose metadata has changed, instead of using the file date and/or file size?
Also, on the completion of every restore, there is a modal box that says "8500 Warnings" but we can't see all of them in the log.
What we can see in the log is:
MetadataWriteFailed
Failed to apply metadata to file
EDIT:
We uninstalled Duplicati Beta and installed Canary in its place. What we see now is all of our data. All of the rows are being backed up, whereas with Beta they were not; we were missing rows of data.
One other thing that we noticed was that when the Beta version restores, all of the date/time values for every file are set to the date and time of the restore. With Canary, all of the date/time values are preserved.
Using Canary, we no longer see the warning "MetadataWriteFailed Failed to apply metadata to file"
Is this intended behavior between both versions?
Is this intended behavior between both versions?
The canary build has a lot of fixes. I do not recall what was changed with the metadata restore, but if the metadata restore fails (as you see in the beta) that would leave the files with the restore date.
There should not be any changes as to what files are being backed up, so I am not sure what you mean with "we are missing rows of data".

RPM Remote Repository - Package does not match intended download

We're making use of a remote repository and are storing artifacts locally. However, we are running into a problem because the remote repository regularly rebuilds all artifacts that it hosts. In our current state, we update metadata (e.g. repodata/repomd.xml), but artifacts are not updated.
We have to continually clear our local remote-repository-cache out in order to allow it to download the rebuilt artifacts.
Is there any way we can configure Artifactory to re-cache rebuilt artifacts as well as the new artifact metadata?
In our current state, the error we regularly run into is
https://artifactory/artifactory/remote-repo/some/path/package.rpm:
[Errno -1] Package does not match intended download.
Suggestion: run yum --enablerepo=artifactory-newrelic_infra-agent clean metadata
Unfortunately, there is no good answer to that. Artifacts under a version should be immutable; it's dependency management 101.
I'd put as much effort as possible into convincing the team producing the artifacts to stop overriding versions. It's true that it can sometimes be cumbersome to change versions of dependencies in metadata, but there are ways around it (like resolving the latest patch during development, as supported by the semver spec), and in any case, that's not a good excuse.
If that's not possible, I'd look into enabling direct repository-to-client streaming (i.e. disabling artifact caching) to prevent the problem of stale artifacts.
Another solution might be cleaning up the cache using a user plugin or a script using JFrog CLI once you learn about newer artifacts being published in the remote repository.
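For example, a scheduled JFrog CLI call along these lines would drop the cached copies so the next request re-fetches the rebuilt artifacts (a sketch; the repository key and path are placeholders, and the remote cache is addressed as "<repo-key>-cache"):

# delete the locally cached copies under a given path of the remote repository
jfrog rt delete "remote-repo-cache/some/path/" --quiet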

Julia: Recompiling stale cache file?

The following message appears every time I attempt to use 'Gadfly', 'Bio', or several other packages (I'm using 'Bio' in the example):
julia> using Bio
INFO: Recompiling stale cache file C:\Users\CaitlinG\emacs251\.julia\lib\v0.5\Distributions.ji for module Distributions.
INFO: Recompiling stale cache file C:\Users\CaitlinG\emacs251\.julia\lib\v0.5\Bio.ji for module Bio.
Julia 0.5.1 (all packages updated)
Windows 10 (fully updated)
Emacs 25.1
This is inconvenient since I can only assume it is not a "typical" component of importing a package. Can the issue be resolved by deleting the .julia directory?
Thanks.
Moving my comment to an answer since that appears to have resolved the question:
Julia caches its precompiled output within the .julia/lib folder. If any of the files there are older than the original source, it'll recompile them. It seems like Julia was having trouble overwriting the cache for a few specific packages here, so it was persistently recompiling them. By deleting the lib folder, you clear out these caches. Julia will recompile all packages, but it should now write them with the correct permissions that will allow them to be overwritten in the future.
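Concretely, removing just the cache directory from the path shown in the INFO messages looks like this (a sketch; adjust the path to wherever your .julia directory actually lives):

julia> rm(joinpath(homedir(), "emacs251", ".julia", "lib", "v0.5"), recursive=true)

The .ji files will be rebuilt the next time each package is loaded.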
Deleting the entire .julia folder is a much more drastic step that risks losing edits you've made to packages, and you'd need to re-install all the packages you've added.
The messages about recompiling a stale cache file are not warnings, but rather for your information. It means that something changed on your system and the current cache file is no longer considered valid, so instead of providing you a potentially old cachefile, Julia automatically cleans up and recompiles the cache.

git build number c#

I'm trying to embed git describe-generated version info into AssemblyInfo.cs plus a label within an ASP.NET website.
I already tried using git-vs-versionino, but this assumes the Git executable is on the PATH. However, the default install of msysgit on Windows does not set this up; it only provides Git Bash. This caused problems.
Now I am looking for a way to use the libgit2sharp library (for zero external dependencies) as a build number generator. However, this library has no describe command...
Thanks!
git-describe is a UI feature that nobody has implemented in the library or bindings yet (or at least nobody's contributed it), but you can do it yourself fairly easily.
You get a list of the tags and the commits they point to, then walk down the commit history and count how many steps it takes to reach a commit that is in the list you built. That already gives you the information you need: if the count is zero, your description is the tag name only; otherwise, you append the number of steps and the current commit's id to it.
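A rough LibGit2Sharp sketch of that walk is below. It is illustrative only: it ignores git describe's tag-priority and exact counting rules, and assumes a reasonably recent LibGit2Sharp where Tag exposes FriendlyName.

using System.Collections.Generic;
using LibGit2Sharp;

static class GitVersion
{
    public static string Describe(string repoPath)
    {
        using (var repo = new Repository(repoPath))
        {
            // Map each tagged commit's SHA to a tag name, peeling annotated tags
            // down to the commit they ultimately point at.
            var tagged = new Dictionary<string, string>();
            foreach (var tag in repo.Tags)
            {
                GitObject target = tag.Target;
                while (target is TagAnnotation)
                    target = ((TagAnnotation)target).Target;
                tagged[target.Sha] = tag.FriendlyName;
            }

            // Walk the history from HEAD and count the steps to the nearest tagged commit.
            int steps = 0;
            foreach (var commit in repo.Commits)
            {
                string tagName;
                if (tagged.TryGetValue(commit.Sha, out tagName))
                {
                    return steps == 0
                        ? tagName
                        : string.Format("{0}-{1}-g{2}", tagName, steps,
                                        repo.Head.Tip.Sha.Substring(0, 7));
                }
                steps++;
            }

            // No reachable tag: fall back to the abbreviated commit id.
            return repo.Head.Tip.Sha.Substring(0, 7);
        }
    }
}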
There's a work-in-progress libgit2 pull request that proposes an implementation of git-describe functionality.
See #1066 for more information.
It's not finished yet. Make sure to subscribe to it in order to be notified of its future progress.
Once it's done, it should be quite easy to bind it and make it available through LibGit2Sharp.
