From a Windows command line, I'd like to be able to publish to an RSS feed. I visualize something like this:
rsspub #builds "Build completed without errors."
Then, someone could go to my computer:
http://xp64-Matt:9090/builds/rss.xml
And there'd be a new entry with the date and time and the simple text "Build completed without errors."
I'd like the feed itself to run on a different port, so I'm not fighting with IIS or Apache, or whatever else I need to run on my computer on a day-to-day basis.
Does anything like this exist?
Here's a simple .NET 3.5 C# program that will create (or append to) an RSS XML file that you can store in your IIS webroot:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Xml;
using System.IO;

namespace CommandLineRSS
{
    class Program
    {
        static void Main( string[] args )
        {
            var file = args[ 0 ];
            var newEntry = args[ 1 ];

            var xml = new XmlDocument();
            if ( File.Exists( file ) )
                xml.Load( file );
            else
                xml.LoadXml( @"<rss version='2.0'><channel /></rss>" );

            var xmlNewEntry = Create( (XmlElement)xml.SelectSingleNode( "/rss/channel" ), "item" );
            Create( xmlNewEntry, "title" ).InnerText = newEntry;
            Create( xmlNewEntry, "pubDate" ).InnerText = DateTime.Now.ToString( "R" );

            xml.Save( file );
        }

        private static XmlElement Create( XmlElement parent, string tag )
        {
            var a = parent.OwnerDocument.CreateElement( tag );
            parent.AppendChild( a );
            return a;
        }
    }
}
Then you can call it like this:
CommandLineRSS.exe c:\inetpub\wwwroot\builds.xml "Build completed with errors."
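The program above only writes the file; the question also asks for the feed to be served on its own port, away from IIS. Here is a minimal, hypothetical sketch using HttpListener. The port 9090 and the /builds/ path come from the question; the temp-file location and the single-request handling are assumptions for illustration only:

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

// Hypothetical feed file; in practice this would be the file the
// command-line tool appends to.
string feedFile = Path.Combine(Path.GetTempPath(), "builds.xml");
File.WriteAllText(feedFile, "<rss version='2.0'><channel /></rss>");

// Serve the file on its own port so IIS/Apache are untouched.
var listener = new HttpListener();
listener.Prefixes.Add("http://localhost:9090/builds/");
listener.Start();

// Answer a single request in the background (a real server would loop).
var serverTask = Task.Run(async () =>
{
    HttpListenerContext ctx = await listener.GetContextAsync();
    byte[] bytes = File.ReadAllBytes(feedFile);
    ctx.Response.ContentType = "application/rss+xml";
    await ctx.Response.OutputStream.WriteAsync(bytes, 0, bytes.Length);
    ctx.Response.Close();
});

using var client = new HttpClient();
string body = await client.GetStringAsync("http://localhost:9090/builds/rss.xml");
await serverTask;
listener.Stop();
Console.WriteLine(body);
```

A real tool would keep listening in a loop (or run as a Windows service) and re-read the file on every request, so each publish shows up immediately.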
I have a requirement to store JSON on a single line (without any formatting) in a blob storage file. I am using an Azure Function with Newtonsoft.Json for some manipulation and then writing to a blob. But when I try to use JToken.Parse I get an exception or an internal server error. Below is the code I am using:
#r "Newtonsoft.Json"

using System.Net;
using System.IO;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using System.Linq;
using System.Threading.Tasks;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

public static async Task<IActionResult> Run(HttpRequest req, TextWriter outputBlob, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    log.LogInformation($"Response is {requestBody}");

    dynamic jObject = JsonConvert.DeserializeObject(requestBody);
    JToken jCategory = jObject;
    var clus = jCategory["clusters"];
    foreach (JObject item in clus)
    {
        var custom_tag = item["custom_tags"];
        var app_logical_name = item.SelectToken("custom_tags.app_name");
        item.SelectToken("init_scripts_safe_mode").Parent.AddAfterSelf(new JProperty("app_logical_name", app_logical_name));
    }

    var clus2 = JsonConvert.SerializeObject(jCategory, Formatting.None);
    //var clus_new = JArray.Parse(clus).toString(Newtonsoft.Json.Formatting.None);
    outputBlob.Write(clus2);
    // outputBlob.Write(clus_new);
    return new OkObjectResult(requestBody);
}
I have tried both ways, but both give runtime errors. I just need to put the JSON on a single line (without any formatting) and write it to the blob. Can you please help me with this?
This is the structure on my side:
{
    "clusters": [
        {
            "custom_tags": {
                "app_name": "appname1"
            },
            "init_scripts_safe_mode": {
                "xxx": "yyy"
            }
        },
        {
            "custom_tags": {
                "app_name": "appname2"
            },
            "init_scripts_safe_mode": {
                "xxx2": "yyy2"
            }
        }
    ],
    "test": "333"
}
And this is my code:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

namespace FunctionApp3
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("C# HTTP trigger function processed a request.");
            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            log.LogInformation($"Response is {requestBody}");

            dynamic jObject = JsonConvert.DeserializeObject(requestBody);
            JToken jCategory = jObject;
            var clus = jCategory["clusters"];
            foreach (JObject item in clus)
            {
                var custom_tag = item["custom_tags"];
                var app_logical_name = item.SelectToken("custom_tags.app_name");
                var xxx = item.SelectToken("init_scripts_safe_mode");
                xxx.Parent.AddAfterSelf(new JProperty("app_logical_name", app_logical_name));
                log.LogInformation(JsonConvert.SerializeObject(custom_tag, Formatting.None));
                log.LogInformation(JsonConvert.SerializeObject(app_logical_name, Formatting.None));
            }
            return new OkObjectResult(clus);
        }
    }
}
It seems to work without problems on my side:
If you get a server-side error, check the detailed logs to find out where the error comes from.
The 500 error by itself is not enough to solve this problem; you need the specific error from the Azure Function. You can use Application Insights to get the detailed error. The Function App must be configured with Application Insights before you can view the log in the portal.
So you need to configure Application Insights for your Function App like this:
Then your Function App will restart.
Of course, you can also go to Kudu to view the logs:
First, go to Advanced Tools, then click 'Go'.
After you get to Kudu, click Debug Console -> CMD -> LogFiles -> Application -> Functions -> yourtriggername. You will find the log file there.
If you are on Linux, after going to Kudu just click 'Log stream' (this is not supported on the consumption plan for Linux).
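On the original goal itself (single-line output), Formatting.None on the serializer is all that is needed. A dependency-free sketch is below; it uses System.Text.Json so it runs standalone, but with Newtonsoft the equivalent call is JsonConvert.SerializeObject(token, Formatting.None):

```csharp
using System;
using System.Text.Json.Nodes;

// Pretty-printed input, shaped like the question's payload.
string pretty = "{\n  \"clusters\": [ { \"custom_tags\": { \"app_name\": \"appname1\" } } ],\n  \"test\": \"333\"\n}";

// ToJsonString always emits compact, single-line JSON.
string oneLine = JsonNode.Parse(pretty)!.ToJsonString();
Console.WriteLine(oneLine);
```

Whatever exceptions remain come from the manipulation code, not from the single-line serialization step itself.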
I have this code, which I wrote as a console application, and I want to use it inside my ASP.NET MVC 5 website so that the data it fetches from RSS feeds is stored in a database.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Xml;
using System.Xml.Linq;
using System.Text;
using System.Threading.Tasks;
using System.ComponentModel.DataAnnotations;
using System.Data.Entity;
using CodeHollow.FeedReader;
using CodeHollow.FeedReader.Feeds;
using System.Data;
using System.Data.SqlClient;
using System.Data.Entity.Core;
using System.Data.Entity.Infrastructure;

namespace RssFeedBackEnd
{
    public class Program
    {
        public static void Main(string[] args)
        {
            RssFeedDB db = new RssFeedDB();
            var qry = (from c in db.Links select c);
            foreach (var li in qry)
            {
                try
                {
                    var feed = FeedReader.Read(li.adressLink);
                    var item = feed.Items;
                    if (feed.Type.ToString() == "Rss_2_0")
                    {
                        //New n = new New();
                        foreach (var items in item)
                        {
                            string v = items.Id;
                            int h = validate(v);
                            if (h == 0)
                            {
                                try
                                {
                                    New n = new New();
                                    n.TitleNews = items.Title;
                                    n.LinkNews = items.Link;
                                    n.TitlePage = feed.Title;
                                    n.LinkPage = feed.Link;
                                    n.Linkimg = items.SpecificItem.Element.Element("enclosure").Attribute("url").Value;
                                    n.IDurl = items.Id;
                                    n.Pubdate = DateTime.Now;
                                    n.EntryTime = DateTime.Now;
                                    n.Description = items.Description;
                                    n.IdCatogrey = li.CategorayID;
                                    //n.NameCatogrey = li.Category.NameCategory.ToString();
                                    db.News.Add(n);
                                }
                                catch { }
                            }
                        }
                    }
                }
                catch { }
            }
            db.SaveChanges();
        }

        static int validate(string n)
        {
            int a = 0;
            RssFeedDB db = new RssFeedDB();
            var qry = (from m in db.News select m);
            foreach (var nws in qry)
            {
                if (nws.IDurl == n) { a = 1; break; }
            }
            return a;
        }
    }
}
I hope someone can show me how to put this inside the web application and have it do its work every minute.
The idea of my project is to fetch news from RSS feed links and store it in the database for display on my website.
I hope I've explained clearly what I want to do.
A web application is fundamentally unsuited for this type of task. A website's job is to respond to requests from others (usually people, via browsers) as and when they occur. It does not run automatically at specific times.
To achieve your goal, on your server you need either a Windows Service with a timer in it (which then executes the necessary code every time the timer expires), or a Windows Scheduled Task which triggers a specific application at regular intervals.
Since you already have a Console application which does the job you need, then the simplest solution for you is probably to set up a Windows Scheduled Task which executes the console application regularly.
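Creating such a Scheduled Task is a one-liner from an elevated command prompt; a sketch, where the task name and the path to your compiled console exe are placeholders:

```shell
schtasks /Create /TN "FetchRssFeeds" /TR "C:\Apps\RssFeedBackEnd.exe" /SC MINUTE /MO 1
```

Note that with a one-minute interval a new run can start before the previous one finishes if a feed is slow, so the console application should tolerate overlapping runs.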
N.B. You can of course still have a separate website which can display the data that the console application has saved in the database.
P.S. There are extensions to ASP.NET such as Hangfire and Quartz.NET which can add scheduling capabilities to a web application, but they are probably overkill for what you want to do in this case.
I have this json string that looks like:
string jsonString = "[ {\"id\": \"1\"}, {\"id\": \"2\", \"category\": \"toys\"} ]";
The quotation marks are already escaped in the string. I want a nice way to turn JSON like this into a link by converting it to GET parameters. I have only really seen solutions that work well for flat structures.
Edit: I also need to be able to convert back into the JSON string.
Initial Answer
You could do something like this to turn it into a link:
using System;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using System.Text;

public class Program
{
    public static void Main()
    {
        dynamic obj = JArray.Parse(jsonString);
        var builder = new StringBuilder();
        builder.Append("?id0=" + obj[0].id);
        builder.Append("&id1=" + obj[1].id);
        builder.Append("&category1=" + obj[1].category);
        Console.WriteLine("http://www.something.com" + builder.ToString());
    }

    public static string jsonString = @"[ {""id"": ""1""}, {""id"": ""2"", ""category"": ""toys""} ]";
}
Output:
http://www.something.com?id0=1&id1=2&category1=toys
More Generic Follow Up Answer
Based on your comment, here is something more generic:
using System;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using System.Text;
using System.Collections.Generic;

public class Program
{
    public static void Main()
    {
        JArray array = JArray.Parse(jsonString);
        var builder = new StringBuilder();
        for (var i = 0; i < array.Count; ++i)
        {
            JToken obj = array[i];
            foreach (JProperty prop in obj)
            {
                var prefix = i == 0 ? "?" : "&";
                builder.AppendFormat("{0}{1}{2}={3}", prefix, prop.Name, i, prop.Value);
            }
        }
        Console.WriteLine("http://www.something.com" + builder.ToString());
    }

    public static string jsonString = @"[ {""id"": ""1""}, {""id"": ""2"", ""category"": ""toys""} ]";
}
Output:
http://www.something.com?id0=1&id1=2&category1=toys
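For the "convert back" part of the question, the index-suffix scheme in the generic answer can be reversed by stripping the trailing digits off each parameter name. A sketch using System.Text.Json (so it runs standalone; the same idea works with Newtonsoft's JArray/JObject):

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json.Nodes;

// Reverse of the "?id0=1&id1=2&category1=toys" encoding: each key is
// <property><index>, so strip the trailing digits to recover both parts.
string query = "id0=1&id1=2&category1=toys";
var objects = new SortedDictionary<int, JsonObject>();
foreach (string pair in query.Split('&'))
{
    string[] kv = pair.Split('=');
    int split = kv[0].Length;
    while (split > 0 && char.IsDigit(kv[0][split - 1])) split--;
    string name = kv[0].Substring(0, split);
    int index = int.Parse(kv[0].Substring(split));
    if (!objects.ContainsKey(index)) objects[index] = new JsonObject();
    objects[index][name] = kv[1];
}

// Reassemble the array in index order and serialize back to JSON.
var array = new JsonArray();
foreach (var obj in objects.Values) array.Add(obj);
string json = array.ToJsonString();
Console.WriteLine(json);
```

This assumes property names never end in a digit; a real implementation would need a less ambiguous separator, and values arriving in a real query string would also need URL-decoding.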
Okay, to answer my own question: this isn't a good way to solve the problem. The JSON string can be very large, and the resulting URL may be truncated or rejected by the server if it exceeds 2048 characters.
The best approach is to create a jsonHashTable.json that stores the hash of what you are trying to send as the key, and the thing you are sending as the value. Then email the hash/key, and have the controller receiving the hash use the table to look up the data that was needed.
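That lookup-table idea can be sketched as follows; the in-memory dictionary here stands in for the jsonHashTable.json file, and SHA-256 is an assumed choice of hash:

```csharp
using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

// Stands in for jsonHashTable.json (persist this in a real app).
var table = new Dictionary<string, string>();

string Store(string json)
{
    // Hash the payload; the hex digest becomes the key sent in the link.
    byte[] digest = SHA256.HashData(Encoding.UTF8.GetBytes(json));
    string key = Convert.ToHexString(digest);
    table[key] = json;
    return key;
}

string payload = "[ {\"id\": \"1\"}, {\"id\": \"2\", \"category\": \"toys\"} ]";
string key = Store(payload);

// The link carries only the fixed-length key, never the full payload.
Console.WriteLine($"http://www.something.com?data={key}");

// The receiving controller looks the payload back up by key.
string roundTripped = table[key];
```

The key length is constant regardless of payload size, so the 2048-character URL limit is never an issue.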
I have been asked to look at updating an old ASP.net Web Forms application to ASP.net 4.5; specifically to implement Microsoft's 'User Friendly' routing mechanism (NuGet Package Microsoft.AspNet.FriendlyUrls).
Generally, the upgrade was straightforward, but I am left with one problem.
The original developer attached/associated 'meta data' XML files to many of the web pages. For example, /Products/Tables/Oak.aspx might also have the following meta file: /Products/Tables/Oak.aspx.meta
When the page loads, it 'looks' for the meta file and loads it. In a non-rewritten URL environment, this was easy...
string metaUrl = Page.Request.Url.AbsolutePath + ".meta";
string metaPath = Page.Server.MapPath(metaUrl);
if (System.IO.File.Exists(metaPath))
{
    LoadMetaFile(metaPath);
}
In a 'Friendly URL' environment, this is not so easy as the original URL might be rewritten to /Products/Tables/Oak or maybe even rewritten completely via a custom MapPageRoute() definition.
Does anyone know if there a way that I can find/determine the 'true' path of the page?
The solution posted by Petriq in ASP.NET WebForms: Request.GetFriendlyUrlFileVirtualPath() returns empty string works perfectly in my scenario.
For reference, here is Petriq's code for his HttpRequest extension method:
using System.Web;
using System.Web.Routing;
using Microsoft.AspNet.FriendlyUrls;

namespace Utils.Extensions
{
    public static class HttpRequestExtensions
    {
        public static string GetFileVirtualPathFromFriendlyUrl(this HttpRequest request)
        {
            string ret = request.GetFriendlyUrlFileVirtualPath();
            if (ret == string.Empty)
            {
                foreach (RouteBase r in RouteTable.Routes)
                {
                    if (r.GetType() == typeof(Route))
                    {
                        Route route = (Route)r;
                        // Following line modified for case-insensitive comparison
                        if (String.Compare("/" + route.Url, request.Path, true) == 0)
                        {
                            if (route.RouteHandler.GetType() == typeof(PageRouteHandler))
                            {
                                PageRouteHandler handler = (PageRouteHandler)route.RouteHandler;
                                ret = handler.VirtualPath;
                            }
                            break;
                        }
                    }
                }
            }
            return ret;
        }
    }
}
I have a working web service which on load contacts different websites and scrapes relevant information from them. As the requirements grew, so did the number of HttpWebRequests.
Right now I'm not using any asynchronous requests in the web service, which means ASP.NET handles the requests one at a time. This has obviously become a burden, as a single request to the web service can take up to 2 minutes to complete.
Is there a way to convert all these HttpWebRequests inside the web service to run multi-threaded?
What would be the best way to achieve this?
Thanks!
If you are working with .NET 4+, you can use the Parallel class or the Task library, which make such things easy.
If you call all your web services the same way (assuming they all respect the same WSDL and differ only by URL), you can use something like this:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Text.RegularExpressions;

namespace ConsoleApplication2
{
    class Program
    {
        private const string StartUrl = @"http://blog.hand-net.com";

        private static void Main()
        {
            var content = DownloadAsString(StartUrl);

            // The "AsParallel" here is the key
            var result = ExtractUrls(content).AsParallel().Select(
                link =>
                {
                    Console.WriteLine("... Fetching {0} started", link);
                    var req = WebRequest.CreateDefault(new Uri(link));
                    var resp = req.GetResponse();
                    var info = new { Link = link, Size = resp.ContentLength };
                    resp.Close();
                    return info;
                }
            );

            foreach (var linkInfo in result)
            {
                Console.WriteLine("Link : {0}", linkInfo.Link);
                Console.WriteLine("Size : {0}", linkInfo.Size);
            }
        }

        private static string DownloadAsString(string url)
        {
            using (var wc = new WebClient())
            {
                return wc.DownloadString(url);
            }
        }

        private static IEnumerable<string> ExtractUrls(string content)
        {
            var regEx = new Regex(@"<a\s+href=""(?<url>.*?)""");
            var matches = regEx.Matches(content);
            return matches.Cast<Match>().Select(m => m.Groups["url"].Value);
        }
    }
}
This small program first downloads an HTML page, then extracts all the hrefs, producing a list of remote URLs.
The AsParallel here allows the body of the Select to run in parallel.
This code has no error handling or cancellation support, but it illustrates the AsParallel method.
If you can't call all your web services the same way, you can also use something like this:
Task.WaitAll(
    Task.Factory.StartNew(() => GetDataFromWebServiceA()),
    Task.Factory.StartNew(() => GetDataFromWebServiceB()),
    Task.Factory.StartNew(() => GetDataFromWebServiceC()),
    Task.Factory.StartNew(() => GetDataFromWebServiceD())
);
This code adds four tasks that will be run "when possible". The WaitAll method simply waits for all the tasks to complete before returning.
By "when possible" I mean as soon as a worker thread in the thread pool is free. When using the Task library, the thread pool defaults to roughly one worker thread per processor core, so if you have 100 tasks, they will be processed by 4 worker threads on a 4-core computer.
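A runnable sketch of that WaitAll pattern, with placeholder methods standing in for the hypothetical GetDataFromWebServiceA..D calls:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Placeholder for a slow web service call (an assumption for this sketch;
// a real version would issue an HttpWebRequest here).
string FetchFrom(string name)
{
    Thread.Sleep(200); // simulate network latency
    return "data from " + name;
}

var taskA = Task.Factory.StartNew(() => FetchFrom("A"));
var taskB = Task.Factory.StartNew(() => FetchFrom("B"));
var taskC = Task.Factory.StartNew(() => FetchFrom("C"));
var taskD = Task.Factory.StartNew(() => FetchFrom("D"));

// Blocks until all four calls finish; since the tasks run concurrently,
// the elapsed time is roughly one call's latency rather than four.
Task.WaitAll(taskA, taskB, taskC, taskD);
Console.WriteLine(taskA.Result + ", " + taskD.Result);
```

Each task's Result property then holds the value returned by the corresponding call.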