Can the GNU TLS library be built without gtk-doc? - gnu-make

The title more or less says it all.
From what digging I've managed to do, the answer to whether this is possible seems to be "no", but the whole GNU build tools platform is something I know very little about, so I could easily be missing some detail.
Context
I'm trying to set up an as-hermetic-as-I-can build of something that needs gnutls, and I'm trying to minimize the list of things that need to be installed prior to the build (i.e. don't use the thing, or build the thing from source). One attribute of the build process is that the only accessible build artifact from gnutls will be the absolute minimum needed to call and link against it (more or less the .a files and the include directory). So even if documentation is generated, it won't ever be accessible. As such, a dependency on a doc generation tool is something I strongly want to avoid.
What I've dug up
It seems, based on this merge-request comment, that gnutls requires that gtkdocize be installed regardless of whether it will be used. The context there seems to imply that this is by design, to make the gnutls development process (as opposed to the use of gnutls) less error prone. (I'm really hoping I'm misreading things or that something has changed since then.)
This seems like an odd choice, as it takes a problem experienced only by developers and solves it by requiring anyone who wants to build the library (even as a transitive dependency) to take on that dependency, even if they will actively suppress its use. (This seems particularly odd given that it expands the potential attack surface for a security project.)

Related

What is the best practice for handling composer abandoned packages?

When I run composer updates I'll occasionally get messages that packages are abandoned and I should use a different one instead, like Package webflo/drupal-core-require-dev is abandoned, you should avoid using it. Use drupal/core-dev instead. I don't have experience with Composer so I'm curious as to what is seen as the best practice for replacing outdated packages.
Where do these messages come from? I'm unsure if the source is always reliable.
I think the best practice is quite clear from the message: "you should avoid using it". How and when to do this is not as clear. Abandoned packages will not receive updates, but Composer will not be able to tell you how difficult it will be to transition to the recommended alternative. It might be that all you have to do is replace the package because it was only a name change, or you might have to modify your code as well.
In your case webflo/drupal-core-require-dev only contains a composer.json, and the required packages match what the alternative drupal/core-dev provides. That means replacing the package should be as easy as changing the name in your composer.json and then running composer update drupal/core-dev.
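To make the mechanics concrete, here is a minimal sketch of that change (the ^8.8 constraint is only an illustrative placeholder; use whatever matches your setup). In composer.json, replace

    "require-dev": {
        "webflo/drupal-core-require-dev": "^8.8"
    }

with

    "require-dev": {
        "drupal/core-dev": "^8.8"
    }

and then run composer update drupal/core-dev so the lock file picks up the new package.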
For packages where the answer is not as straightforward, you have to rely on automated/manual tests to see if everything still works. Static code analysis tools might help as well. You will have to set them up before you do the change, so that you can see how their output differs and fix the new issues that come up.
You should do the switch to the new dependency as early as possible. Leaving it in will likely cause more work in the future when replacing it, and might pose a security risk (if it is outdated and insecure). I understand that this is not always possible, and using something like roave/security-advisories to tell you when there are known security issues in a package might help when postponing it, giving some sense of security.

Properly managing private NuGet package dependencies between solutions

We have 2 solutions.
SolutionA is an internal solution where we put code that is reused across our products.
For the sake of the question, it has only two projects NugetProjectA and NugetProjectB which has a project reference to NugetProjectA.
SolutionB is a solution that has package references to SolutionA's projects via NuGet.
The thing that troubles us is:
add a new method in NugetProjectA
add a new method in NugetProjectB that uses the previous method
publish a new version of NugetProjectB
update the NuGet reference in a project of SolutionB
execute, in that project, the newly added method of NugetProjectB
Since we didn't publish the updated version of NugetProjectA, the last step described will fail.
This seems like an easy problem to solve. But imagine this with many more projects in SolutionB and many more in SolutionA.
Your question is vague and open-ended, so there isn't a simple, concise answer. Honestly it could easily be a multi-post blog series or even a short book. I'll try to give some general suggestions, but I'm not going to go into a lot of details to avoid making this answer too long.
Mono-repo vs multiple repos
Just like the microservice craze from a few years ago, you should first ask yourself if you need it. Having all your source code in a single repository and solution might make it feel "legacy", and it sure seems nicer to have a 3 minute build of a component rather than a 30 minute build of a whole system when checking in a 1-line bug fix. But are the problems you listed in your question worth the benefit of a shorter CI build?
On the other extreme, Google famously runs almost everything out of a single repository, but they have teams of people doing nothing more than managing their mono-repo and build system, because a large number of developers working on a single repo have a different set of problems. These are engineers not working on customer applications. If their work makes other teams more productive, then it can be worthwhile.
You need to figure out what's best for you, but given the problem you described in the question, maybe you have too many repos/solutions and your processes could be faster with fewer mistakes if you consolidate a little.
Automation
The next best thing is to just use good engineering practices. Automate as much as possible to reduce the risk of human mistakes, including automating processes that validate that manual processes are followed correctly.
Have a CI or CD pipeline push packages, so you can't forget to push dependencies, as in your example.
There are tools that will automatically generate a version number from the git history based on the number of commits since the versioning config file was checked in. This prevents you from forgetting to increment the package version when you check in a change to the package.
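One such tool, as far as I know, is Nerdbank.GitVersioning: you check a small version.json into the repository root (the number below is only an example), and the final part of the version is derived from the number of commits since that file last changed:

    {
      "version": "1.2"
    }

That way every check-in gets a distinct package version without anyone remembering to bump it by hand.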
Have tests to make sure the package works before pushing it. You could make it as simple as making sure that dependencies exist, but you could also go further and have test programs run that use the package to ensure that backwards compatibility isn't broken and that new features actually work as expected. Have these tests run before pushing the package, so if the tests fail, you don't push a bad package.
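A rough sketch of that ordering in a CI script, using the .NET CLI (the project paths, package file name, feed URL and API key variable are all placeholders):

    dotnet test ./tests/Company.Common.Tests                    # gate: fail here and nothing gets published
    dotnet pack ./src/Company.Common -c Release -o ./artifacts
    dotnet nuget push ./artifacts/Company.Common.1.2.3.nupkg --source https://nuget.example.com/v3/index.json --api-key "$NUGET_API_KEY"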
You can have a pipeline that will automatically create pull requests on solutions that use a package when a new version of one of your packages is published. You can even automatically merge them, but you'll want to make sure you have really good tests to avoid a buggy package cascading into a huge mess, particularly if you do automatic deployments on successful builds.
Be creative and think about other ways you can automate your processes to make things easier for yourself and your team.
But be pragmatic about what you automate. There's no point spending a week full time to automate something that only takes you 5 minutes once a month to do manually. But if the manual process sometimes goes wrong and causes hours or days of effort to fix, then that makes automating the process more worthwhile.
Use modern features
The .NET ecosystem has changed a lot in the last 2-3 years since .NET Core was announced. Packing with MSBuild (either dotnet pack or msbuild -t:pack) is now an easier way to create packages than writing .nuspec files and making sure you do the right things to get project dependencies packed as NuGet dependencies, get all files in the right places, etc. If your class library uses SDK-style projects, there's nothing extra to do. If your project is a traditional project, you'll need to use PackageReference for your NuGet dependencies (or specify <ProjectStyle>PackageReference</ProjectStyle> as an MSBuild property in your project file), and then reference the NuGet.Build.Tasks.Pack package.
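For instance, a pack-ready SDK-style class library project can be as small as this (the identifiers are illustrative):

    <Project Sdk="Microsoft.NET.Sdk">
      <PropertyGroup>
        <TargetFramework>netstandard2.0</TargetFramework>
        <PackageId>Company.Common.Serialization</PackageId>
        <Version>1.2.3</Version>
      </PropertyGroup>
    </Project>

Running dotnet pack -c Release on that produces the .nupkg directly, and ProjectReferences to other packable projects become NuGet dependencies in the generated package.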
Version the application, not each package
Like the mono-repo vs multiple repos point, consider versioning all packages in the application with a single version number, rather than versioning each package individually. Yes, this means you'll sometimes (or maybe often) publish a new version of a package that doesn't have any code changes compared to the previous version, but it simplifies the release considerably. Coupled with packing with MSBuild as in the section above, you can create a Directory.Build.props file in your repository root and set the <Version> property to your app version, and all projects in the repo will have the same version. So, when you're ready to release, bump the version in a single file and every project and every NuGet package will have the same version.
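A sketch of that single file at the repository root (the number is a placeholder):

    <Project>
      <PropertyGroup>
        <Version>3.4.0</Version>
      </PropertyGroup>
    </Project>

MSBuild imports Directory.Build.props automatically for every project underneath it, so each project inherits the version unless it explicitly overrides the property.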
Summary
In an ideal world each component would be reusable in different applications, live in a separate source code repository, and each package would be individually versioned using semantic versioning. But in the real world this adds a lot of development-time complexity. Your customers may be happier to get bug fixes and new features more quickly, even if the version numbers of packages are less meaningful. So, make data-driven decisions. If you're frequently having dependency version problems, reduce your dependencies so there are fewer things that can go wrong.
Don't get me wrong, there are many good reasons to have multiple projects, multiple solutions, multiple repositories. Just make sure that the reason you're doing them is because it helps your team/company be more productive, not for idealistic reasons that are slowing you down or causing bugs.

Realm and RxSwift connectivity

I've been looking at options for persistence when using RxSwift, and Realm was looking attractive due to its relative simplicity and the availability of some extensions in the community repo.
Unfortunately, although I can get Realm and RxSwift working nicely in Xcode 8b6, things go seriously wrong as soon as you try to connect them together, as RxRealm does not currently compile (there seems to be more going wrong with it than the Grand Renaming, as far as I can tell).
Is there a workaround that is reliable? I can't believe for a moment that there isn't, I just can't find a resource at present. I was thinking of converting the Result object into a Set or Array and making this Observable, but I'm not sure if the contents (Realm Objects) are going to be handled correctly. Knowing my luck, I suspect not!
There's a Pull Request towards the RxRealm project adding Swift 3 support: https://github.com/RxSwiftCommunity/RxRealm/pull/26
I suggest you try using that.
More generally, targeting an Xcode beta will by definition give you a less stable software ecosystem, since no one is submitting apps with that and it's a moving target (often with weekly breaking changes). So if you want stable software, use stable tools. Realm and RxRealm both support Swift 2.2 quite well, so using that will give you the best experience.

How do small software patches correct big software?

One thing I've always wondered about is how software patches work. A lot of software seems to just release new versions of their binaries that need to be installed over older versions, but some software (operating systems like Windows in particular) seems to be able to release very small patches that correct bugs or add functionality to existing software.
Most of the time the patches I see can't possibly replace entire applications, or even small files that are used within applications. To me it seems like the actual binary is being modified.
How are these kinds of patches actually implemented? Could anyone point me to any resources that explain how this works, or is it just as simple as replacing small components such as linked libraries in an application?
I'll probably never need to do a deployment in this manner, but I am curious to find out how it works. If I'm correct in my understanding that patches can really modify only portions of binary files, is this possible to do in .NET? If it is I'd like to learn it since that's the framework I'm most familiar with and I'd like to understand how it works.
This is usually implemented using binary diff algorithms -- diff the most recently released version against the new code. If the user's running the most recent version, you only need to apply the diff. Works particularly well against software, because compiled code is usually pretty similar between versions. Of course, if the user's not running the most recent version you'll have to download the whole thing anyway.
There are a couple of implementations of generic binary diff algorithms: bsdiff and xdelta are good open-source implementations. I can't find any implementations for .NET, but since the algorithms in question are pretty platform-agnostic, it shouldn't be too difficult to port them if you feel like a project.
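To give a feel for the workflow, this is roughly how the bsdiff pair of tools is used (the file names are placeholders):

    bsdiff  app-1.0.exe  app-1.1.exe  update-1.0-to-1.1.patch    # run once, at release time
    bspatch app-1.0.exe  app-1.1.exe  update-1.0-to-1.1.patch    # run on the user's machine to reconstruct 1.1

The patch file is typically a tiny fraction of the full binary, because most of the compiled code is unchanged between the two versions.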
If you are talking about patching Windows applications, then what you want to look at are .MSP files. These are similar to an .MSI but just patch an application.
Take a look at Patching and Upgrading in the MSDN documents.
What an .MSP file does is load updated files into an application install. These are typically updated DLLs and resource files, but could include any file.
In addition to patching the installed application, the repair files located in C:\WINDOWS\Installer are updated as well. Then, if the user selects "Repair" from Add/Remove Programs, the updated patch files are used.
I'm thinking that the binary diff method discussed by John Millikin must be used in other operating systems. Although you could make it work in Windows, it would be somewhat alien.

When should one use a project reference opposed to a binary reference?

My company has a common code library which consists of many class library projects along with supporting test projects. Each class library project outputs a single binary, e.g. Company.Common.Serialization.dll. Since we own the compiled, tested binaries as well as the source code, there's debate as to whether our consuming applications should use binary or project references.
Some arguments in favor of project references:
Project references would allow users to debug and view all solution code without the overhead of loading additional projects/solutions.
Project references would assist in keeping up with common component changes committed to the source control system, as changes would be easily identifiable within the active solution.
Some arguments in favor of binary references:
Binary references would simplify solutions and make for faster solution loading times.
Binary references would allow developers to focus on new code rather than potentially being distracted by code which is already baked and proven stable.
Binary references would force us to appropriately dogfood our stuff as we would be using the common library just as those outside of our organization would be required to do.
Since a binary reference can't be debugged (stepped into), one would be forced to replicate and fix issues by extending the existing test projects rather than testing and fixing within the context of the consuming application alone.
Binary references will ensure that concurrent development on the class library project will have no impact on the consuming application, as a stable version of the binary will be referenced rather than an in-flux version. It would be the decision of the project lead whether or not to incorporate a newer release of the component if necessary.
What is your policy/preference when it comes to using project or binary references?
It sounds to me as though you've covered all the major points. We've had a similar discussion at work recently and we're not quite decided yet.
However, one thing we've looked into is to reference the binary files, to gain all the advantages you note, but have the binaries built by a common build system where the source code is in a common location, accessible from all developer machines (at least if they're sitting on the network at work), so that any debugging can in fact dive into library code, if necessary.
However, on the same note, we've also tagged a lot of the base classes with appropriate attributes in order to make the debugger skip them completely, because any debugging you do in your own classes (at the level you're developing) would otherwise be vastly outsized by code from the base libraries. This way, when you hit the Step Into debugging shortcut key on a library class, you resurface into the next piece of code at your current level, instead of having to wade through tons of library code.
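The answer doesn't name the attributes, but presumably they are the debugger attributes from System.Diagnostics; a minimal C# illustration (the type and method are made up):

    using System.Diagnostics;

    // Marks this type as "not my code", so Step Into skips over it
    // and lands back in the calling code at your own level.
    [DebuggerNonUserCode]
    public class CommonStringHelper
    {
        [DebuggerStepThrough]
        public static string Normalize(string value)
        {
            return value == null ? null : value.Trim().ToLowerInvariant();
        }
    }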
Basically, I definitely vote up (in SO terms) your comments about keeping proven library code out of sight for the normal developer.
Also, if I load the global solution file, that contains all the projects and basically, just everything, ReSharper 4 seems to have some kind of coronary problem, as Visual Studio practically comes to a stand-still.
In my opinion the greatest problem with using project references is that it does not provide consumers with a common baseline for their development. I am assuming that the libraries are changing. If that's the case, building them and ensuring that they are versioned will give you an easily reproducible environment.
Not doing this will mean that your code will mysteriously break when the referenced project changes. But only on some machines.
I tend to treat common libraries like this as 3rd-party resources. This allows the library to have its own build processes, QA testing, etc. When QA (or whoever) "blesses" a release of the library, it's copied to a central location available to all developers. It's then up to each project to decide which version of the library to consume, by copying the binaries to a project folder and using binary references in the projects.
One thing that is important is to create debug symbol (.pdb) files with each build of the library and make those available as well. The other option is to actually create a local symbol store on your network and have each developer add that symbol store to their VS configuration. This would allow you to debug through the code and still have the benefits of using binary references.
As for the benefits you mention for project references, I don't agree with your second point. To me, it's important that the consuming projects explicitly know which version of the common library they are consuming and for them to take a deliberate step to upgrade that version. This is the best way to guarantee that you don't accidentally pick up changes to the library that haven't been completed or tested.
When you don't want it in your solution, or there is potential to split your solution, send all library output to a common bin directory and reference it there.
I have done this in order to allow developers to open a tight solution that only has the Domain, tests and Web projects. Our Windows services, Silverlight stuff, and web control libraries are in separate solutions that include the projects you need when looking at those, but NAnt can build it all.
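In a classic (non-SDK) project file, a reference into that shared bin directory looks something like this (assembly name and path are illustrative):

    <Reference Include="Company.Common.Serialization">
      <HintPath>..\..\CommonBin\Company.Common.Serialization.dll</HintPath>
    </Reference>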
I believe your question is actually about when projects go together in the same solution; the reason being that projects in the same solution should have project references to each other, and projects in different solutions should have binary references to each other.
I tend to think solutions should contain projects that are developed closely together. Such as your API assemblies and your implementations of those APIs.
Closeness is relative, however. A designer for an application, by definition, is closely related to the app, however you wouldn't want to have the designer and the application within the same solution (if they are at all complex, that is). You'd probably want to develop the designer against a branch of the program that is merged at intervals further spaced apart than the normal daily integration.
I think that if the project is not part of the solution, you shouldn't include it there... but that's just my opinion
In short, I separate it by concept.
