How does Visual Studio discover NuGet packages with source symbols? - .net-core

I prepared a symbol package successfully using dotnet pack's --include-symbols and --include-source switches. Now I wonder how to tell Visual Studio to use that package when trying to step into code of the corresponding non-symbol one.
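For reference, the pack invocation was along these lines (MyLibrary.csproj stands in for the real project file):
dotnet pack MyLibrary.csproj --configuration Release --include-symbols --include-source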
I tried placing the symbol package in a local folder and configuring a solution-level nuget.config file to use that folder as a package source. The idea is that there might be some naming convention that looks for packages like {name}.symbols in all configured package sources... but that doesn't work.
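The solution-level nuget.config looked roughly like this (the folder path is a placeholder):
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="local-symbols" value="C:\LocalPackages" />
  </packageSources>
</configuration>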
Official docs (especially the older ones) talk a lot about configuring "symbol servers", but if I understand correctly, that's something different and older, right? If I wanted to set up an internal symbol server, I wouldn't do that through NuGet. (I really don't want to set up an internal symbol server.)
They also suggest pushing to smbsrc.net, but I obviously can't do that with internal code. Also, I can't believe there would be hard-coded URLs in the .NET toolbox.

I didn't find a way to meaningfully use the source included by dotnet pack.
There are alternatives though:
SourceLink offers a way to configure a mapping between source-code build paths and HTTP locations. Unfortunately, that does not work for private repositories without specific support for the source control server's authentication method. Bitbucket Server, for example, is not yet supported.
You can embed sources directly into PDBs. I will probably go that way.
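For anyone following the same route, a minimal csproj sketch using the standard compiler properties for that approach:
<PropertyGroup>
  <!-- embed all compiled sources into the PDB -->
  <EmbedAllSources>true</EmbedAllSources>
  <!-- embed the PDB itself into the assembly -->
  <DebugType>embedded</DebugType>
</PropertyGroup>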

Related

Add a logging entry to appsettings.json via a custom NuGet package

I have a NuGet package that I have created for my company. It works great. But it includes some code in the System.Net.Http.HttpClient.OAuthClient namespace that is very chatty in the logs at the Information level.
I would like to have my NuGet package automatically add the following line:
"System.Net.Http.HttpClient.OAuthClient": "Error"
at the end of the Logging -> LogLevel section of the appsettings.json file of the project it is installed in (if it's not already there).
Or failing that, is there a way to suppress these logs down to the Error LogLevel in code?
It's just my personal opinion, and it doesn't answer your question directly, but it didn't fit into a comment, so I'm posting it here.
I don't think it's up to your package to decide what goes into appsettings.json. There may be, say, production and test appsettings, and in test, the developers of some projects may actually want to see as many logs as possible.
If your package modified it, you would take that decision away from the developers who use your package. Or imagine that a project using your package also used the System.Net.Http.HttpClient.OAuthClient namespace, but for its own purposes; its developers might want to see logs coming from that namespace.
I'd suggest just putting this info into your package's documentation, and then the developers of each project can decide for themselves whether they want to suppress the logs or not.
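As for the "suppress in code" part of the question, a sketch of what such documentation could suggest, using the standard logging filter API in the consuming project's own startup code (the generic-host wiring here is illustrative):
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

// raise the minimum level for this one category without touching appsettings.json
var host = Host.CreateDefaultBuilder(args)
    .ConfigureLogging(logging =>
        logging.AddFilter("System.Net.Http.HttpClient.OAuthClient", LogLevel.Error))
    .Build();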
Update after comment:
Probably this will get you somewhere: http://www.roundcrisis.com/2014/08/03/Nuget-install-tricks/
The basic idea is that you can distribute an init.ps1 script together with your NuGet package that will run on installation of the package, and with PowerShell you can do pretty much anything, including replacing strings in text files.
Have a look also at this SO question: Execute an action after my nuget package is installed
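A rough sketch of such a script, written as an install.ps1 (the per-project install hook; note that NuGet only runs these scripts for packages.config-style projects, and this sketch assumes appsettings.json sits next to the project file and already has a Logging.LogLevel section):
param($installPath, $toolsPath, $package, $project)

# locate appsettings.json next to the project file (an assumption of this sketch)
$file = Join-Path (Split-Path $project.FullName) 'appsettings.json'
if (Test-Path $file) {
    $json = Get-Content $file -Raw | ConvertFrom-Json
    # add or overwrite the category override
    $json.Logging.LogLevel | Add-Member -NotePropertyName 'System.Net.Http.HttpClient.OAuthClient' -NotePropertyValue 'Error' -Force
    $json | ConvertTo-Json -Depth 10 | Set-Content $file
}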

Artifactory - Concept of File Versions

I'm currently starting with JFrog Artifactory. Up to now I have only been working with source control systems, not with binary repositories.
Can someone please tell me how the versioning of files is done in Artifactory?
I have been trying to deploy a file, then change it and deploy it again.
The checksum has changed, so it's the new file, but it seems that the old version is gone.
So it looks like there is no versioning of files. If I want that, do I have to encode it in the filename?
I found versioning related to packages, but I was thinking of using it for other files as well.
Thanks for your help
Christoph
Artifactory, unlike a VCS, does not manage a history of versions for a given path. When you deploy an artifact over an existing artifact, it will overwrite it (you can block this by configuring the right permissions).
If you wish to manage versions for generic artifacts (ones which are not managed by a known package manager like npm, Maven etc.), there are a couple of options you can take (an upload sketch follows the list):
Add the version as part of the artifact name, for example foo-1.0.0.zip
Add the version as part of the artifact path, for example /foo/1.0.0/foo.zip
Combine the 2 above approaches, for example /foo/1.0.0/foo-1.0.0.zip
Use an existing package management tool which is flexible enough to handle generic packages. Many people are using Maven to manage all types of packages beyond Java ones (it comes with its pros and cons)
From the Artifactory point of view there are a couple of capabilities you can leverage:
Generic repositories - aimed at managing proprietary packages which are not managed by a known package manager
Custom repository layout - can be used to define a custom layout for your generic repository and assist with tasks like automatic snapshot version cleanup
Properties - can be used to add version (and other) metadata to your artifacts, which can be used for searching, querying, resolution and more (a small REST sketch follows this list)
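A sketch of attaching a version property to an already-deployed artifact via the REST API (same placeholder host and repository as above):
curl -u deployer:password -X PUT "https://artifactory.example.com/artifactory/api/storage/generic-local/foo/1.0.0/foo-1.0.0.zip?properties=version=1.0.0"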
Lastly, Conan is another option you should consider. Conan is a package manager intended for C and C++ packages. It is natively supported in Artifactory and can give you a more complete solution for managing your C libraries.

Build fails due to different package location

I'm new to Realm DB and tried to integrate it into our mobile app (Xamarin.iOS using Visual Studio).
I've added the NuGet package to the PCL project as well as to the executable project.
When building, the build fails with the following message:
1>C:\Projects\CoachApp-Fork\ExternalPackages\Realm.0.74.1\build\Realm.targets(6,5): error MSB3030: Could not copy the file "C:\Projects\CoachApp-Fork\Build\Solutions\packages/Realm.0.74.1/tools/RealmWeaver.Fody.dll" because it was not found.
There seems to be a build task in Realm that tries to copy the DLL; however, in our solution the NuGet packages are located in a different place (configured via Nuget.Config).
Any ideas how to resolve this? I cannot simply change the NuGet package location, as the build server relies on it...
That was my clever trick you broke!
The problem is that the expected location of the Fody DLL is quite restrictive, and yet the Xamarin templates have two different relative package directory locations depending on whether you have a multi-target project. (This was an improvement to cache packages per-solution, I think sometime in 2015.)
Can you put your own manual copy step into your build, at least for now, to put the DLL there in advance? I realise this isn't an optimal solution, but it might work as a band-aid.
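A sketch of what that copy step could look like as a pre-build target in the project file (both paths are examples and must be adapted to the actual restore location and to the path Realm.targets expects):
<Target Name="CopyRealmWeaver" BeforeTargets="BeforeBuild">
  <!-- hypothetical source: wherever Nuget.Config actually restores packages -->
  <Copy SourceFiles="$(SolutionDir)..\MyPackages\Realm.0.74.1\tools\RealmWeaver.Fody.dll"
        DestinationFolder="$(SolutionDir)packages\Realm.0.74.1\tools" />
</Target>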

How to make Qt aware of the QMYSQL driver

I'm trying to access a MySql database from a Qt application but I get the following error:
QSqlDatabase: QMYSQL driver not loaded
QSqlDatabase: available drivers: QSQLITE QSQLITE2
I find this very strange because I have libqsqlmysql.so in my Qt folder. I have even tried to compile the MySQL driver as a static plugin and add it to my .pro file as:
QTPLUGIN += qsqlmysql
But this also generates the same runtime error (it must have found the plugin, because there is no error compiling the application).
What am I missing? I would like to avoid having to compile Qt from source, because this will have to work seamlessly on the deployment machines as well.
BTW: even though I'm developing and testing on Linux, I will need to support Windows. Will I experience this same issue on Windows? How can I compile and link the MySQL driver on both Linux and Windows?
The solution:
After following #Sergey's recommendations I ran strace on the application, redirecting the output to grep so I could search for 'mysql', and to my surprise the application wasn't looking for the plugin in QTDIR/plugins/sqldrivers, where I had libqsqlmysql.so; it was looking in QTDIR/lib. After copying the plugin to the lib folder, the MySQL connection worked.
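The command was something along these lines (myapp is a placeholder for the actual binary):
strace -e trace=open ./myapp 2>&1 | grep mysql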
Try opening the shared library with dlopen() and see if it loads; if not, see what dlerror() tells you. I always run into similar problems on Windows; LoadLibrary()/GetLastError() has saved me numerous times (last time it was because of a wrong version of some libiconv/libintl DLL). Running ldd on the plugin may also help.
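A minimal standalone check along those lines (the plugin path is an example; build with g++ check.cpp -ldl):
#include <dlfcn.h>
#include <cstdio>

int main()
{
    // try to load the plugin exactly as the dynamic linker would
    void *handle = dlopen("/usr/lib/qt4/plugins/sqldrivers/libqsqlmysql.so", RTLD_LAZY);
    if (!handle) {
        std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }
    dlclose(handle);
    std::puts("plugin loads fine");
    return 0;
}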
If dlopen() works fine, try to load the plugin with QPluginLoader. If it doesn't load, then check the build key of the plugin. I usually do it the dirty way: running strings on the plugin and then looking for strings like "buildkey" or "QT_PLUGIN_VERIFICATION_DATA". Just looking at the build key and around it may give you an idea. For example, you may realize that you have compiled your plugin in release mode while your application is compiled in debug mode. In such a case the build key won't match and the plugin won't load. Everything in the build key must match your configuration. Note that the version and the build key are checked differently: the build key must match exactly (or match some black magic called QT_BUILD_KEY_COMPAT), but for the version only the major version must match exactly; the minor version of the Qt you run against must be the version the plugin was compiled with or later, and the patch level is ignored. So if your plugin was compiled with Qt 4.x.y then it will work with Qt versions 4.z.* where z>=x. This actually makes sense.
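The QPluginLoader check itself can be as small as this (the path again is an example):
#include <QtCore/QCoreApplication>
#include <QtCore/QPluginLoader>
#include <QtCore/QtDebug>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    QPluginLoader loader("/usr/lib/qt4/plugins/sqldrivers/libqsqlmysql.so");
    // errorString() reports build-key/version mismatches as well as loader errors
    if (!loader.load())
        qDebug() << "load failed:" << loader.errorString();
    else
        qDebug() << "plugin loaded";
    return 0;
}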
If the build key looks okay (which is unlikely if you got to this point), you may wish to look at QLibraryPrivate::isPlugin() source code to figure out what's wrong, but that doesn't look like an easy task to me (although running this in a debugger may help).
If QPluginLoader does load the plugin, check if it is in the right directory and has correct permissions. If you still didn't solve the problem by this point, it's time to look at the SQL module source code that actually loads these plugins. But it is extremely unlikely. I ran into this problem many, many times and it was always either the library not loading or the build key not matching.
Another way to go after QPluginLoader loads the plugin successfully is to use strace to figure out whether the program at least tries to open the plugin file. Searching for something like "sqldrivers" or "plugins" in the strace output should also give away the directory where Qt is searching for its plugins and specifically SQL drivers.
Update
Is it possible to compile the driver as a static plugin and don't worry about anything? Let's try:
d:\Qt4\src\plugins\sqldrivers\psql>qmake CONFIG+=static LIBS+=-Ld:/programs/PostgreSQL/lib INCLUDEPATH+=d:/programs/PostgreSQL/include
d:\Qt4\src\plugins\sqldrivers\psql>make
It compiles fine and now I got libqsqlpsql.a (release) and libqsqlpsqld.a (debug) in QTDIR/plugins/sqldrivers (it is the right place on Windows). I am using PostgreSQL driver here, but I don't think it will be any different for MySQL which I just don't have installed. Ok, let's compile some real program with it:
d:\alqualos\pr\archserv>qmake QTPLUGIN+=qsqlpsql PREFIX=d:/alqualos LIBS+=-Ld:/gnu/lib INCLUDEPATH+=d:/gnu/include LIBS+=-Ld:/programs/PostgreSQL/lib LIBS+=-lpq
Note that I had to manually link to libpq, otherwise the linker would complain about undefined references. The funny thing is, qmake knows that qsqlpsql is located in QTDIR/plugins/sqldrivers and sets compiler and linker options accordingly. So it still needs to be in the right place to work, only you don't have to worry about your users running into the same problem as it is only used during compilation. An alternative would be to just use LIBS+=-Lpath/to/plugin LIBS+=-lqsqlpsql instead of QTPLUGIN+=qsqlpsql, at least the docs say that it should work, but I haven't tested it.
In order for the application to actually use the plugin I had to put the following in my main unit (CPP file):
#include <QtPlugin>
Q_IMPORT_PLUGIN(qsqlpsql)
It works! Also, from what I've been able to figure out from the sources, the build key and the version are checked only when a plugin is dynamically loaded (all the relevant stuff is in the QLibrary's private class, not even QPluginLoader's). So the resulting executable may (or may not, depending on the binary compatibility) work even with different versions and builds of Qt, although using it with older versions may trigger some bugs that were fixed later.
It is also worth noting that the order for loading SQL drivers is this: use the driver statically linked into Qt if available, then look for a driver registered manually with QSqlDatabase::registerSqlDriver(), then look for a driver statically imported into the application (the way described above), and finally try to load a shared plugin. So when you link statically, your users won't be able to use dynamically linked drivers they may already have, but will be able to use drivers linked statically into Qt (like in Ubuntu).
I compiled Qt first and then realised that I needed MySQL as well, so I compiled the MySQL plugin by executing the following command in the QT-DIR\src\plugins\sqldrivers\mysql folder.
MySQL plugin compile command
qmake "INCLUDEPATH+=$$quote(C:\Program Files\MySQL\MySQL Server 5.5\include)" "LIBS+=$$quote(C:\Program Files\MySQL\MySQL Server 5.5\lib\libmysql.lib)" mysql.pro
The plugins are then created in the QT-DIR\plugins\sqldrivers folder.
However, when I tried to use it in my code, it failed with the following error.
Error msg
QSqlDatabase: QMYSQLDriver driver not loaded
Solution
After some googling and checking the Path variable, I realised that the MySQL server lib directory (C:\Program Files\MySQL\MySQL Server 5.5\lib) was not in my Path variable. I expect that the DLLs in this folder are used by the plugin at runtime. After including the MySQL server lib directory in the Path variable, everything worked smoothly. Hope this information saves some hair on other programmers' scalps, as I uprooted quite a few. :D
Last time I looked at this, you needed to rebuild Qt from source and include the appropriate MySQL sources.
Building Qt from the sources is not hard, it just takes a while. You are likely to have the required tools already.
A possible workaround may be to access the back-end over ODBC instead.
In order for your app to pick up the plugin at runtime, the shared library implementing the MySQL plugin needs to be placed in the correct directory. The best way of determining that directory is to check the output of QCoreApplication::libraryPaths. You can also force specific paths by using a qt.conf file.
Please note that plugins must be placed in subdirectories within the plugin path, and the final part of the path name (i.e., the parent directory of the shared libraries) cannot be changed. SQL drivers need to go in a directory named sqldrivers, i.e. <pluginpath>/sqldrivers. For more details on plugin directories, see How to Create Qt Plugins.
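A quick way to see those directories from inside your own application (a few-line sketch):
#include <QtCore/QCoreApplication>
#include <QtCore/QtDebug>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    // Qt searches each of these paths for a sqldrivers subdirectory
    qDebug() << QCoreApplication::libraryPaths();
    return 0;
}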
I was experiencing this same issue as well. I've been installing and experimenting with a lot of different Python tools and UIs. I then uninstalled everything python related. I did a fresh install of Python 3.2, PyQT 3.2, and Eric5. No more errors with the QMySQL driver.
Well, I have had this issue, and after a lot of time and different tools, I found that Qt (on Windows; I have not been able to test on Linux) loads the QSQLMYSQL plugin when requested, but the library file must reside in one of the searched paths (QApp.libraryPaths()) inside a folder called "sqldrivers"; otherwise Qt will just ignore the file, even if it is somewhere else within the searched paths.
What I did was monitor the dependencies of a sample app: when I removed the QSQLMYSQL DLL from plugins\sqldrivers\ it failed, but when I made a folder called "sqldrivers" inside the app folder and placed the QSQLMYSQL DLL there, it loaded.
What I have is MySQL 5.5 and Qt 4.7.4.
Hope anyone can use this, and if anyone knows more about it, I would like to know where to find it (http://doc.qt.nokia.com/stable/sql-driver.html is the closest you can get to information about the folder structure). :P
This may also happen if your QMYSQL plugin is linked against the "wrong" mysql_client.a, or if that library isn't in LD_LIBRARY_PATH. I had this problem on OS X because MySQL was installed via ports, and I fixed it with:
install_name_tool -change libmysqlclient.18.dylib /usr/local/mysql/lib/libmysqlclient_r.18.dylib libqsqlmysql.dylib

Automatic BizTalk Versioning in My Build Process

In all of my other .NET apps, my build process (a mixture of NAnt and custom tasks) automatically updates the [AssemblyVersionAttribute] in AssemblyInfo.cs with the current build number before the call to MSBuild, stamping the build number into the version number.
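For readers unfamiliar with the mechanism, the stamped attribute in AssemblyInfo.cs looks like this (version numbers are illustrative):
using System.Reflection;

// the build process rewrites the last component with the current build number,
// e.g. 1.2.0.4711 for build 4711
[assembly: AssemblyVersion("1.2.0.0")]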
I'm now working on my first BizTalk project and I'd like to do the same thing with the version numbers of the BizTalk assemblies, but I've run into trouble!
First of all, the assembly version numbers are stored in the btproj files, so I did some googling and found www.codeplex.com/biztalk, which looked like the answer to my problem, but there is a deeper problem!
I have a project for my schemas and another for my pipelines; the pipelines project references my schemas project, as I have flat-file dis/assemblers. The problem comes when I update the version numbers: updating them even from within Visual Studio does not update the pipeline components' references to the schemas.
So if I update all the version numbers manually in the VS IDE from 1.0.0.0 to 1.1.0.0, the build fails, as the pipeline components' flat-file dis/assemblers still reference the old 1.0.0.0 version of the schemas! They don't automatically update!
Is this really a manual process of updating the version numbers of the BizTalk projects in the property pages, then building the projects and manually updating the references to them in the properties of all the pipeline components that reference them?
This means that I can't have my build process control the build number part of my version numbers!
Or is there a better method of managing the version numbers of the BizTalk assemblies?
I'm sorry to disappoint you, but I've been down the exact same road and had to give up. I guess it could be possible to achieve, but it would require a lot of changes to both the binding files and other XML files (as you mentioned, and even more if you have published services etc.).
Maybe it could be possible to wrap all these necessary changes in a build step (a MSBuild step or similar in other build frameworks) - that would be useful!
Developer- :)
We had a similar problem, and we ended up developing a small utility which would change the version number in all the projects, i.e. *.csproj (AssemblyInfo.cs) and *.btproj, accordingly. Apart from this, it would open and modify the *.btp files with the new version of the schemas. In a nutshell, all you have to do is configure this utility in your VS.NET Tools menu and execute it.
I guess it's not very difficult to develop such a utility in any .NET language.
Caveat: do not forget to save the files after the updates with the same encoding they originally had.
Cheers!
Gutted, I thought that might be the case. Maybe BizTalk 2009 projects will play more nicely when it comes to updating references after version-number changes.
I started to go through and automate it manually, but when I realised just how many places I'd have to modify to get it working, I took a biiig step back. Thank god for Undo Checkout.
I do have a standard C# class library included in my project (various helper functions), whose version number I am able to update during my build process, so I'm basically using that one assembly to version the whole application. If anyone wants to know what version is in any environment, they can check the version number of that one assembly.
Not ideal, but it's working.
We've done this successfully on our project - I'll see if I can get the developer of the tool to post details...
This problem arises when you perform an integration build against the latest versions of your dependent components referenced as files (the schemas here).
Keep in mind that upgrading the assembly version must always be performed manually; that way you are always in charge of changes to assembly versions.
A possible solution to the build-break issue is to file-reference a specific version of a dependent component's build, not the latest version, and use a subst drive plus a copy script to get the latest component builds.
For example:
SchemaA, assembly version 1.0.0.0
PipelineA (with pipelinecomponent XMLValidator for example), assembly version 1.0.0.0
PipelineA has a file reference to a subst drive (say the R: drive, which maps to a workspace D:\MyComponents) and to version 1.0.0.0 of SchemaA as follows:
R:\SchemaA\1.0.0.0\SchemaA.dll.
The copy script copies the build output of SchemaA locally to your R: drive.
When SchemaA updates to version 1.1.0.0, you don't have any issues, because you still use version 1.0.0.0 and YOU have the choice of when to move to the 1.1.0.0 version of your schema. When you want to upgrade, you alter your copy script and change the file reference to R:\SchemaA\1.1.0.0\SchemaA.dll.
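A sketch of the subst-plus-copy script as a batch file (the build-output path is an example; the R: drive and component paths follow the layout above):
rem map the workspace as the R: drive expected by the file references
subst R: D:\MyComponents
rem copy the latest build output of the pinned version into place
xcopy /Y /I "D:\BuildOutput\SchemaA\1.0.0.0\SchemaA.dll" "R:\SchemaA\1.0.0.0"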
