C# Get all types from referenced libraries - reflection

I'm trying to build a dependency management tool, so I need to get all the classes from all the referenced libraries.
There are three projects written in C#, .NET Core 3.0.
ProjectA
ProjectB
ProjectC
ProjectA is an executable console app.
And ProjectA references ProjectB. ProjectB references ProjectC.
ProjectA -> ProjectB -> ProjectC
For code in ProjectC:
// In ProjectC
Assembly.GetExecutingAssembly().GetTypes()
// That can list all classes in ProjectC.
And for code in ProjectC:
// In ProjectC
Assembly.GetEntryAssembly().GetTypes()
// That can list all classes in ProjectA.
But how can I list all classes in ProjectB? (Code in ProjectC, run by ProjectA)

I have got a solution:
Just start building a dependency tree from the entry assembly:
// Code this in project C. Run this by project A.
// Entry is A.
var entry = Assembly.GetEntryAssembly();
entry
// This gets B and C as `AssemblyName` values.
.GetReferencedAssemblies()
// Load assembly B and C.
.Select(t => Assembly.Load(t))
// Get all classes in B and C.
.SelectMany(t => t.GetTypes())
.ToList();
The way to get referenced assemblies is:
Assembly.GetReferencedAssemblies Method
https://learn.microsoft.com/en-us/dotnet/api/system.reflection.assembly.getreferencedassemblies?view=netframework-4.8
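Note that GetReferencedAssemblies only returns an assembly's direct references, so a single pass may not reach deeper dependencies. Below is a hedged sketch of a worklist walk over the whole graph (the DependencyWalker/WalkDependencies names are made up for illustration; Assembly.Load can throw for references that cannot be resolved, hence the try/catch):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

class DependencyWalker
{
    // Walks the dependency graph breadth-first, yielding each assembly once.
    static IEnumerable<Assembly> WalkDependencies(Assembly root)
    {
        var seen = new HashSet<string>();
        var stack = new Stack<Assembly>();
        stack.Push(root);
        while (stack.Count > 0)
        {
            var assembly = stack.Pop();
            if (!seen.Add(assembly.FullName)) continue; // skip already-visited assemblies
            yield return assembly;
            foreach (var reference in assembly.GetReferencedAssemblies())
            {
                try { stack.Push(Assembly.Load(reference)); }
                catch (Exception) { /* some references may not be loadable; skip them */ }
            }
        }
    }

    static void Main()
    {
        // All types in A, B, C and everything they transitively reference.
        var allTypes = WalkDependencies(Assembly.GetEntryAssembly())
            .SelectMany(a => a.GetTypes())
            .ToList();
        allTypes.ForEach(t => Console.WriteLine(t.FullName));
    }
}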

Related

ASP.NET Core appsettings.json loading

I have an ASP.NET Core (2.1) project that has an appsettings.json. I use WebHost.CreateDefaultBuilder(). The appsettings.json file has the following configuration in File Properties:
Build Action: Content
Copy to Output Directory: Do not copy
After the build, the appsettings.json ends up in bin\Debug\netcoreapp2.1\MyProj.runtimeconfig.json.
The ASP.NET Core runtime loads it fine.
I created a WebJob (for .NET Core 2.1) and wanted to do the same - set Build Action to Content and have it loaded. In the Main() of Program.cs I have code like:
var builder = new HostBuilder()
...
.ConfigureAppConfiguration(b =>
{
var environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
b.SetBasePath(Directory.GetCurrentDirectory());
b.AddJsonFile("appsettings.json", false, true);
b.AddJsonFile($"appsettings.{environment}.json", true, true);
b.AddEnvironmentVariables();
// Adding command line as a configuration source
if (args != null)
{
b.AddCommandLine(args);
}
});
But the runtime tries to load appsettings.json (instead of MyWebJobProj.runtimeconfig.json). So I had to set Build Action to None and Copy to Output Directory to Always.
However, I would prefer the same approach as in ASP.NET Core - it somehow handles the file name transformation, even though WebHost.CreateDefaultBuilder() contains basically the same code as I have in my WebJob. What performs the magic file name transformation in the configuration, and why does it work only in one type of project?
The file [ProjName].runtimeconfig.json has a completely different meaning than appsettings.json: it is generated by the build to describe runtime options, not your appsettings.json renamed. Ensure that appsettings.json is actually copied to the output directory (set 'Copy to Output Directory' to 'Copy always' or 'Copy if newer').
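For reference, the corresponding entry in the .csproj would look roughly like this (a sketch; Build Action None maps to a None item, and PreserveNewest corresponds to 'Copy if newer'):

<ItemGroup>
  <!-- Copy the settings files to the output directory on build. -->
  <None Update="appsettings.json" CopyToOutputDirectory="PreserveNewest" />
  <None Update="appsettings.*.json" CopyToOutputDirectory="PreserveNewest" />
</ItemGroup>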

How to step inside/debug the source code of Web API 2 and the System.Web DLL

I wanted to step inside/debug the Web API 2 (version 5.2.3) source code to understand it.
I've created a simple Web API application, and in WebApiConfig.cs I've set a breakpoint at this line:
config.Routes.MapHttpRoute(
name: "ExcludedRoute",
routeTemplate: "api/{controller}/{action}",
defaults: new { id = RouteParameter.Optional },
constraints: new { Controller = "healthcheck", action = "check" }
);
I could step inside the MapHttpRoute method and the other classes, but the debugger couldn't step inside the Route class from the System.Web DLL.
I tried configuring additional symbol file locations in the debugger settings.
I've also used the dotPeek tool to generate the PDB files from the bin folder, by setting the 'Copy Local' property of the referenced assemblies to true, but with no success. The path of the System.Web DLL is
C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework.NETFramework\v4.5\System.Web.dll
I'm wondering why the debugger couldn't step inside the Route class which belongs to the System.Web DLL; its version is 4.0 while the version of Web API is 5.2.3.
In my searching, I've downloaded the source code of ASP.NET and found that the project System.Web.Http.WebHost contains the HttpWebRoute class.
Any idea?
Thanks!
The symbols for System.Web (System.Web.pdb) are part of the .NET Framework source. You need to configure Visual Studio for debugging the .NET Framework under Tools -> Options -> Debugging -> General (uncheck 'Enable Just My Code' and check 'Enable .NET Framework source stepping').
Reference:
How do I debug .NET 4.6 framework source code in Visual Studio 2017?

How to publish an artifact with pom-packaging in SBT?

I have a multi-project build in SBT where some projects should aggregate dependencies and contain no code. Clients could then depend on these projects as a single dependency instead of depending directly on all of their aggregated dependencies. With Maven, this is a common pattern, e.g. when using Spring Boot.
In SBT, I figured I can suppress the generation of the empty artifacts by adding this setting to these projects:
packagedArtifacts := Classpaths.packaged(Seq(makePom)).value
However, the makePom task writes <packaging>jar</packaging> in the generated POM. But now that there is no JAR anymore, this should read <packaging>pom</packaging> instead.
How can I do this?
This question is a bit old, but I just came across the same issue and found a solution. The original answer does point to the right page where this info can be found, but here is an example. It uses the pomPostProcess setting to transform the generated POM right before it is written to disk. Essentially, we loop over all the XML nodes, looking for the element we care about and then rewrite it.
import scala.xml.{Node => XmlNode, NodeSeq => XmlNodeSeq, _}
import scala.xml.transform._
pomPostProcess := { node: XmlNode =>
  val rule = new RewriteRule {
    override def transform(n: XmlNode): XmlNodeSeq = n match {
      // Rewrite <packaging>jar</packaging> to <packaging>pom</packaging>.
      case e: Elem if e.label == "packaging" =>
        <packaging>pom</packaging>
      case _ => n
    }
  }
  new RuleTransformer(rule).transform(node).head
}
Maybe you could modify the resulting POM as described here: Modifying the generated POM
You can disable publishing the default artifacts (JAR, sources, and docs), then opt in explicitly to publishing the POM. sbt then produces and publishes only the POM, with <packaging>pom</packaging>.
// This project has no sources; I want <packaging>pom</packaging> with dependencies
lazy val bundle = project
.dependsOn(moduleA, moduleB)
.settings(
publishArtifact := false, // Disable jar, sources, docs
publishArtifact in makePom := true,
)
lazy val moduleA = project
lazy val moduleB = project
lazy val moduleC = project
Run sbt bundle/publishM2 to verify the POM in ~/.m2/repository.
I dare say this is almost intuitive, a rare moment of pleasant surprise with sbt 😅
I confirmed this with current sbt 1.3.9 and with 1.0.1, the oldest launcher I happen to have installed on my machine.
The Artifacts page in the reference docs may be helpful, perhaps this trick should be added there.

Defining plugin dependency between subprojects in SBT?

EDIT:
Since I put up the bounty, I thought I should restate the question
How can an SBT project P, with two sub-projects A and B, set up B to have a plugin dependency on A, which is an SBT plugin?
Giving P a plugin dependency on A does not work, since A depends on other things in P, which results in a circular dependency graph
It has to be a plugin dependency, because A is a plugin needed to run B's test suite.
dependsOn doesn't work, because, well, it has to be a plugin dependency
I'd like to know either of
How to do this, or
Why this is impossible, and what the next best alternatives are.
EDIT: clarified that it's a plugin-dependency, since build-dependency is ambiguous
When you have a multi-project build configuration with "project P and two sub-projects A and B" it boils down to the following configuration:
build.sbt
lazy val A, B = project
As per design, "If a project is not defined for the root directory in the build, sbt creates a default one that aggregates all other projects in the build." It means that you will have an implicit root project, say P (but the name is arbitrary):
[plugin-project-and-another]> projects
[info] In file:/Users/jacek/sandbox/so/plugin-project-and-another/
[info] A
[info] B
[info] * plugin-project-and-another
That gives us the expected project structure. On to defining plugin dependency between B and A.
The only way to define a plugin in an SBT project is to use the project directory, which is the plugins project's build definition - "A plugin definition is a project in <main-project>/project/." It means that the only way to define a plugin dependency on the project A is to use the following:
project/plugins.sbt
addSbtPlugin("org.example" % "example-plugin" % "1.0")
lazy val plugins = project in file(".") dependsOn(file("../A"))
In this build configuration, the plugins project depends on another SBT project that happens to be our A, which is in turn a plugin project.
A/build.sbt
// http://www.scala-sbt.org/release/docs/Extending/Plugins.html#example-plugin
sbtPlugin := true
name := "example-plugin"
organization := "org.example"
version := "1.0"
A/MyPlugin.scala
import sbt._
object MyPlugin extends Plugin
{
// configuration points, like the built in `version`, `libraryDependencies`, or `compile`
// by implementing Plugin, these are automatically imported in a user's `build.sbt`
val newTask = taskKey[Unit]("A new task.")
val newSetting = settingKey[String]("A new setting.")
// a group of settings ready to be added to a Project
// to automatically add them, do
val newSettings = Seq(
newSetting := "Hello from plugin",
newTask := println(newSetting.value)
)
// alternatively, by overriding `settings`, they could be automatically added to a Project
// override val settings = Seq(...)
}
The two files - build.sbt and MyPlugin.scala in the directory A - make up the plugin project.
The only missing piece is to define the plugin A's settings for the project B.
B/build.sbt
MyPlugin.newSettings
That's pretty much all you can do in SBT. If you want a multi-project build configuration with a plugin dependency between (sub)projects, you don't have much choice other than what's described above.
With that said, let's see if the plugin from the project A is accessible.
[plugin-project-and-another]> newTask
Hello from plugin
[success] Total time: 0 s, completed Feb 13, 2014 2:29:31 AM
[plugin-project-and-another]> B/newTask
Hello from plugin
[success] Total time: 0 s, completed Feb 13, 2014 2:29:36 AM
[plugin-project-and-another]> A/newTask
[error] No such setting/task
[error] A/newTask
[error] ^
As you may have noticed, newTask (that comes from the plugin from the project A) is available in the (default) root project and the project B, but not in A.
As Jacek said, it cannot be done the way I would like, since a subproject cannot have an SBT plugin that the root project does not have. On the other hand, this discussion on the mailing list contains several alternatives and will no doubt be useful to anyone who comes across this question in the future.
EDIT: Well, in the end the alternatives mentioned (sbt scripted, etc.) were hard and clunky to use. My final solution was to have a separate project (not a subproject) inside the repo that depends on the original project via its Ivy coordinates, and to use bash to publishLocal the first project, go into the second project, and run its tests:
sbt publishLocal; cd test; sbt test; cd ..
I always thought the point of something like SBT was to avoid doing this kind of bash gymnastics, but desperate times call for desperate measures...
This answer may include the solution: https://stackoverflow.com/a/12754868/3189923
From that link, in short: set exportJars := true and use exportedProducts in Compile to obtain the JAR file paths for a (sub)project.
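A minimal sketch of that approach in a build.sbt (the showJars task key is made up for illustration):

exportJars := true

val showJars = taskKey[Unit]("Prints the jar paths this project exports.")
showJars := {
  // With exportJars := true, exportedProducts points at the packaged jar
  // rather than the classes directory.
  (exportedProducts in Compile).value.map(_.data).foreach(f => println(f))
}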
Leaving the facts about plugins aside: you have a parent project P with sub-projects A and B. And then you state that A depends on P. But P is an aggregate of A and B and hence depends on A. So you already have a circular dependency between A and P. This can never work.
You have to split P in two parts: the part that A depends on (let's call this part A') and the rest (let's call this P_rest). Then you throw away P and make a new project P_rest consisting of A', A and B, where A depends on A'. A sketch of that layout follows below.
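A hedged sketch of the resulting build.sbt (the name aPrime is illustrative; it stands for the shared part A' carved out of the old P):

// build.sbt of P_rest
lazy val aPrime = project              // A': the code A needs from the old P
lazy val A = project
  .dependsOn(aPrime)                   // no cycle: A depends only on A'
  .settings(sbtPlugin := true)         // A is still an sbt plugin
lazy val B = project                   // consumes A as a plugin via project/plugins.sbt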

How to create a basic project setup using sbt-native-packager

I have a project setup working with SBT to the point of creating sub-project artifacts.
I have been searching for a way to create a JAR file that contains sub-project JAR files along with some meta information. Based on suggestions, I looked at sbt-native-packager and it seems to have the capabilities I need.
I am wondering if someone would be willing to help me along this path by providing tips on creating a skeleton package specification for the plugin.
I think my configuration is pretty simple.
What I want to end up with is a JAR file with the following contents:
/manifest.xml
/module.xml
/modules/sub-project-one.jar
/modules/sub-project-two.jar
/modules/sub-project-three.jar
Both the manifest.xml and module.xml files will be generated from project information. The name of the resulting JAR file will be based on the name of the root project, its version, and the suffix "nkp.jar" (e.g. overlay-1.0.1.nkp.jar).
Thanks in advance for any help getting me going with this.
-- Randy
Here's the basics of what you want:
val createManifestXml = taskKey[File]("Creates the packaged manifest.xml")
val createModuleXml = taskKey[File]("Creates the module.xml file.")
// TODO - Define createManifestXml + createModuleXml
mappings in Universal ++= {
  Seq(createManifestXml.value -> "manifest.xml",
      createModuleXml.value -> "module.xml")
}
mappings in Universal ++= {
  val moduleJars =
    Seq((packageBin in subProjectOne).value,
        (packageBin in subProjectTwo).value,
        (packageBin in subProjectThree).value)
  moduleJars map { jar =>
    jar -> s"modules/${jar.getName}"
  }
}
This will ensure that the tgz/txz/zip can be generated. You can then either use the "generic" universal -> msi and universal -> rpm/deb mappings or create that mapping by hand if you desire.
Hope that helps!
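One hedged way to fill in the TODO above (the XML written here is a placeholder; the real content would come from your project metadata):

createManifestXml := {
  val file = target.value / "manifest.xml"
  // Placeholder content generated from build settings; adjust to your schema.
  IO.write(file, s"""<manifest name="${name.value}" version="${version.value}"/>""")
  file
}

createModuleXml := {
  val file = target.value / "module.xml"
  IO.write(file, s"""<module name="${name.value}"/>""")
  file
}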
