Where is the rendered response persisted in Phoenix?

defmodule TwittEx.PageController do
  require IEx
  use TwittEx.Web, :controller

  def index(conn, _params) do
    text conn, "hello"
    "dummy string so text/2 isn't returned"
  end
end
I come from Rails and understand what controllers are supposed to do, but I can't understand how the magic happens in Phoenix. Elixir data is immutable, so conn and _params cannot be modified in place. How does text conn, "hello" remember "hello" when I don't return it from index?
I suspect there is another process that keeps track of the response. If so, does that mean a single vanilla request has more than one process?

How does text conn, "hello" remember "hello" when I don't return it from index?
It doesn't! The response (in this case "hello") is written to the conn immediately after you call text. You can test this out using the following:
def index(conn, _params) do
  :timer.sleep(1000)
  conn = text conn, "hello"
  :timer.sleep(5000)
  conn
end
If you visit this page, you'll see the response in just over 1 second, not 6 seconds.
If you try to call text on the new conn returned by text, you'll get a Plug.Conn.AlreadySentError, as the new struct's state is :sent. If you call it twice on the same conn, though, there's no error and the second write is ignored. I'm pretty sure it is the responsibility of the Plug handler to ignore the second write, as I just verified that Plug simply calls the handler's send_resp again if you do that.
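You can watch that state transition directly. A minimal sketch using Plug.Test (this assumes a project with the :plug dependency available, e.g. in iex -S mix; text/2 ultimately calls send_resp under the hood):

```elixir
# Illustrative sketch: sending the response marks the returned
# conn's :state field as :sent. Assumes the :plug dependency.
import Plug.Test

conn = conn(:get, "/")
conn.state                                     # => :unset

conn = Plug.Conn.send_resp(conn, 200, "hello")
conn.state                                     # => :sent
conn.resp_body                                 # => "hello"
```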

Related

F#: can I extend a module with an additional module by just referencing another assembly?

I'm working on a tiny F# ADO.NET "wrapper" (yes, yet another one, besides Zaid Ajaj's Npgsql.FSharp, Pim Brouwers's Donald and many others on GitHub), and I am thinking about extending the support for different ADO.NET providers...
Basically I have a core project (i.e. Michelle.Sql.Core) that contains the core types + functions, a bit similar to Dapper:
type IDbValue<'DbConnection, 'DbParameter
                when 'DbConnection :> DbConnection
                and 'DbParameter :> DbParameter> =
    abstract ToParameter: string -> 'DbParameter

type CommandDefinition<'DbConnection, 'DbParameter, 'DbType
                when 'DbConnection :> DbConnection
                and 'DbParameter :> DbParameter
                and 'DbType :> IDbValue<'DbConnection, 'DbParameter>> =
    { Statement: Statement
      Parameters: (string * 'DbType) list
      CancellationToken: CancellationToken
      Timeout: TimeSpan
      StoredProcedure: bool
      Prepare: bool
      Transaction: DbTransaction option }
First thing, you might think: "Whoa, that is a lot of generics ornamenting your type definitions!"
Alright, so first things first: I'm trying to work around some limitations, most notably this one: https://github.com/fsharp/fslang-suggestions/issues/255 (along with its good friend). I thought I could circumvent that issue by creating a C# project and forcing the constraints in that project, but it doesn't work out.
The reason I need that many generic constraints is that I want a strongly-typed connection that kinda "flows" through the calls, setting the values of the different fields of that record, for example:
let playWithSQLite() =
    use connection = new SQLiteConnection()
    Sql.statement "INSERT INTO aTable (aColumn) VALUES(#aNumber);"
    |> Sql.prepare true
    |> Sql.timeout (TimeSpan.FromMinutes(1.))
    |> Sql.parameters [("aNumber", SqliteDbValue.Integer 42L)]
    |> Sql.executeNonQuery connection
Fyi, SqliteDbValue is defined in a different assembly, Michelle.Sql.Sqlite:
// https://www.sqlite.org/datatype3.html
type SqliteDbValue =
    | Null
    | Integer of int64
    | Real of double
    | Text of string
    | Blob of byte array

    interface IDbValue<SQLiteConnection, SQLiteParameter> with
        member this.ToParameter(name) =
            let parameter = SQLiteParameter()
            // Not so secret impl. goes here...
            parameter
The code above works: basically, the CommandDefinition record is populated via different calls defined in the core library through a Sql module (decorated with RequireQualifiedAccessAttribute).
The problem arises when the user needs to explicitly indicate the generic return type...
[<RequireQualifiedAccess>]
module Sql =
    // [...]
    let executeNonQuery
        (connection: 'DbConnection when 'DbConnection :> DbConnection)
        (commandDefinition: CommandDefinition<'DbConnection, 'DbParameter, 'DbType>
            when 'DbConnection :> DbConnection
            and 'DbParameter :> DbParameter
            and 'DbType :> IDbValue<'DbConnection, 'DbParameter>) =
        async {
            // Not so secret impl. goes here
        }

    let executeScalar<'Scalar, .. >
        (connection: 'DbConnection when 'DbConnection :> DbConnection)
        (commandDefinition: CommandDefinition<'DbConnection, 'DbParameter, 'DbType>
            when 'DbConnection :> DbConnection
            and 'DbParameter :> DbParameter
            and 'DbType :> IDbValue<'DbConnection, 'DbParameter>) =
        async {
            // Not so secret impl. goes here
        }
So you see, in the case of the executeScalar function above, since one type has to be made explicit, every other generic parameter now has to be made explicit when calling that function, otherwise they are defaulted to obj. Among other things, this means that the end user now needs to input 4 generic parameters:
// [...] setting up the CommandDefinition...
|> Sql.executeScalar<int64, SQLiteConnection, SQLiteParameter, SqliteDbValue> connection
and this is exactly the kind of thing I would like to avoid while retaining the connection consistency.
What I tried, which is a rather clunky solution, is to implement a reduced version of executeScalar; what I mean by that:
module Michelle.Sql.Sqlite

[<RequireQualifiedAccess>]
module Sql =
    let executeScalar<'Scalar> connection commandDefinition =
        Sql.executeScalar<'Scalar, SQLiteConnection, SQLiteParameter, SqliteDbValue>
            connection
            commandDefinition
But the thing with this strategy is that it essentially boils down to shadowing:
Hence this code below doesn't work:
open Michelle.Sql.Sqlite
open Michelle.Sql.Core
// [...] setting up the CommandDefinition... connection being an instance of SQLiteConnection
|> Sql.executeScalar<int64> connection
While that one does:
open Michelle.Sql.Core
open Michelle.Sql.Sqlite
// [...] setting up the CommandDefinition... connection being an instance of SQLiteConnection
|> Sql.executeScalar<int64> connection
I wish there could be a solution; I even thought about static classes, but partial classes can't be defined across several assemblies.
I know that overloading is not possible with F# module functions, and shadowing doesn't look like a viable solution in terms of developer experience.
So, is there any solution out there? (Putting aside creating another function with a different name or a different module with also a different name)
Koenig Lear suggested:
Not really an answer, but why don't you rename your original module to CoreSql? Then you can create modules for each driver type, e.g. Sql.executeScalar<'T> = CoreSql.executeScalar<'T, SQLiteConnection, ...>, provide aliasing for every single function, and never expose CoreSql.
and this is pretty much what I ended up doing:
// open stuff goes here...
type SqliteCommandDefinition = CommandDefinition<SQLiteConnection, SQLiteParameter, SqliteDbValue>

[<RequireQualifiedAccess>]
module Sqlite =
    // Other functions irrelevant to this post

    let executeScalar<'Scalar> connection (commandDefinition: SqliteCommandDefinition) =
        Sql.executeScalar<'Scalar, _, _, _>
            connection
            commandDefinition

    let executeNonQuery connection (commandDefinition: SqliteCommandDefinition) =
        Sql.executeNonQuery connection commandDefinition

HashDict and OTP GenServer context within Elixir

I am having trouble using the HashDict module within OTP. I would like to use one GenServer process to put and a different one to fetch. When I try to implement this, I can put and fetch items from the HashDict when calling from the same GenServer; it works perfectly (MyServerB in the example below). But when I use one GenServer to put and a different one to fetch, the fetch implementation does not work. Why is this? Presumably it's because I need to pass the HashDict data structure around between the different processes?
Code example below:
I use a simple call to send some state to MyServerB:
MyServerB.add_update(state)
For MyServerB I have implemented the HashDict as follows:
defmodule MyServerB do
  use GenServer

  def start_link do
    GenServer.start_link(__MODULE__, [], name: __MODULE__)
  end

  def init([]) do
    # Initialise HashDict to store state
    d = HashDict.new
    {:ok, d}
  end

  # Client API
  def add_update(update) do
    GenServer.cast __MODULE__, {:add, update}
  end

  def get_state(key) do
    GenServer.call __MODULE__, {:get, key}
  end

  # Server APIs
  def handle_cast({:add, update}, dict) do
    %{key: key, value: value} = update
    dict = HashDict.put(dict, key, value)
    {:noreply, dict}
  end

  def handle_call({:get, some_key}, _from, dict) do
    value = HashDict.fetch!(dict, some_key)
    {:reply, value, dict}
  end
end
So if, from another process, I use MyServerB.get_state(some_key), I don't seem to be able to return the contents of the HashDict...
UPDATE:
So if I use ETS I have something like this:
def init(_) do
  ets = :ets.new(:my_table, [:ordered_set, :named_table])
  {:ok, ets}
end

def handle_cast({:add, update}, state) do
  %{key: key, value: value} = update
  :ets.insert(:my_table, {key, value})
  {:noreply, state}
end

def handle_call({:get, some_key}, _from, state) do
  sum =
    :ets.foldl(fn
                 {key, value}, acc when key == some_key -> value + acc
                 _, acc -> acc
               end, 0, :my_table)
  {:reply, sum, state}
end
So again, the cast works - when I check with Observer I can see the table filling up with my key-value pairs. However, when I try my call it returns nothing. So I'm wondering if I'm handling the state incorrectly? Any help gratefully received.
Thanks
Your problem is with this statement:
I would like to use one GenServer process to put and a different one to fetch.
In Elixir, processes cannot share state. So you cannot have one process with data and another process reading it directly. You could, for example, store the HashDict in one process and then have the other process send a message to the first asking for data. That would make it appear as you describe; however, behind the scenes all transactions would still go through the first process. There are techniques for doing this in a distributed/concurrent fashion so that multiple cores are utilized, but that may be more work than you're looking to do at the moment.
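That "all access goes through the owning process" pattern is exactly what Agent packages up for you; a minimal sketch (the :shared_dict name is hypothetical, and a plain Map stands in for HashDict):

```elixir
# One process owns the map; every caller, from any process, reaches
# it only via messages, which Agent sends under the hood.
{:ok, _pid} = Agent.start_link(fn -> %{} end, name: :shared_dict)

# "Put" from the current process:
Agent.update(:shared_dict, &Map.put(&1, :some_key, 42))

# "Fetch" from a different process still routes through the owner:
result =
  Task.await(Task.async(fn ->
    Agent.get(:shared_dict, &Map.get(&1, :some_key))
  end))
# result == 42
```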
Take a look at ETS, which will allow you to create a public table and access the data from multiple processes.
ETS is the way to go; sharing a HashDict as state between GenServers is not possible.
I really don't know how you are testing your code, but note that ETS defaults to read_concurrency: false and write_concurrency: false. If you are fine with concurrent reads and writes, you can change your init function to:
def init(_) do
  ets = :ets.new :my_table, [:ordered_set, :named_table,
                             read_concurrency: true,
                             write_concurrency: true]
  {:ok, ets}
end
Hope this helps.
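For completeness, the cross-process access both answers describe can be sketched with a :public named table (the table name here is hypothetical):

```elixir
# A :public named ETS table can be read and written from any
# process, with no owning-GenServer round trip required.
:my_shared_table = :ets.new(:my_shared_table, [:set, :public, :named_table])
true = :ets.insert(:my_shared_table, {:a_key, 42})

# A completely different process reads it directly:
lookup = Task.await(Task.async(fn -> :ets.lookup(:my_shared_table, :a_key) end))
# lookup == [a_key: 42]
```

Note that the table still has an owner (the creating process) and is destroyed when that owner exits, so in practice a long-lived process should create it.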

Fire and forget entry/accept mechanism in Ada

Is there a pattern for a fire and forget mechanism in Ada? When I call a task entry, I don't want the caller to be blocked until the message has been processed. I would like the task to be asynchronous. What I've tried is
loop
   select
      accept xxx(params) do
         -- save the parameters in a queue
      end xxx;
      ...
   else
      -- pick the next item off the queue and process it
   end select;
end loop;
It looks like a clumsy mechanism. Maybe fire and forget is the wrong term. I've also tried one task filling up the queue and another taking entries off the queue. Is there a better way of implementing asynchronous tasks in Ada?
If you’re using Ada 2012, the way to go would be to use Ada.Containers.Unbounded_Synchronized_Queues (or the Bounded version): your user code calls Enqueue, your server task calls Dequeue which blocks if the queue is empty.
If not, the normal approach would be to use your own protected object to encapsulate a queue (which is how the Ada 2012 packages do it). Something like
package Parameters is new Ada.Containers.Vectors (Positive, Parameter);

protected Queue is
   procedure Put (P : Parameter);
   entry Get (P : out Parameter);
private
   The_Queue : Parameters.Vector;
end Queue;

protected body Queue is
   procedure Put (P : Parameter) is
   begin
      The_Queue.Append (P);
   end Put;

   entry Get (P : out Parameter) when not The_Queue.Is_Empty is
   begin
      P := The_Queue.First_Element;
      The_Queue.Delete_First;
   end Get;
end Queue;
and then
task body Server is
   P : Parameter;
begin
   loop
      Queue.Get (P);
      -- process P
   end loop;
end Server;

TCP send command and wait for output

I have the following situation:
function Mach3Code(Str: String): String;
var
  StrOut: String;
begin
  StrOut := '';
  try
    IdTelnet1.Connect();
    IdTelnet1.Write(Str);
    StrOut := ''; // assign the return output here
  finally
    IdTelnet1.Disconnect;
  end;
  Result := StrOut;
end;
On the line StrOut := ''; I need to get the text output of the server (which is a TCP server, written in VC 2008 by me as a Mach3 plugin).
Normally, the client sends "COMMAND1" and the server replies with "ANSWER1#" or something like this. I need the code to wait for the answer and then return it, synchronously, so I can do something like:
StrResult := Mach3Code('G0X300Y200');
and read what the server part has sent to me.
any ideas how I can solve this problem?
TIdTelnet is an asynchronous component; it is not suited for what you are attempting to do. Unless you are dealing with the actual Telnet protocol, you should use TIdTCPClient instead:
function Mach3Code(const Str: String): String;
begin
  Result := '';
  try
    IdTCPClient1.Connect();
    IdTCPClient1.WriteLn(Str);
    Result := IdTCPClient1.ReadLn('#');
  finally
    IdTCPClient1.Disconnect;
  end;
end;
To receive data, assign an event handler of type TIdTelnetDataAvailEvent to the OnDataAvailable property of IdTelnet1. I know this is not synchronous, but I would refactor your code to work this way, as this is how the telnet client is designed to work.
Failing that, create your own TIdTCPClientCustom descendant and implement your own read thread with the relevant methods.

F# - The type was expected to have type Async<'a> but has string -> Async<'a> instead

After shamelessly pilfering a code snippet from Tomas Petricek's Blog:
http://tomasp.net/blog/csharp-fsharp-async-intro.aspx
Specifically, this one (and making a few alterations to it):
let downloadPage(url:string) (postData:string) = async {
    let request = HttpWebRequest.Create(url)
    // Asynchronously get response and dispose it when we're done
    use! response = request.AsyncGetResponse()
    use stream = response.GetResponseStream()
    let temp = new MemoryStream()
    let buffer = Array.zeroCreate 4096
    // Loop that downloads page into a buffer (could use 'while'
    // but recursion is more typical for functional language)
    let rec download() = async {
        let! count = stream.AsyncRead(buffer, 0, buffer.Length)
        do! temp.AsyncWrite(buffer, 0, count)
        if count > 0 then return! download() }
    // Start the download asynchronously and handle results
    do! download()
    temp.Seek(0L, SeekOrigin.Begin) |> ignore
    let html = (new StreamReader(temp)).ReadToEnd()
    return html };;
I tried to do the following with it, and got the error on the last line:
The type was expected to have type Async<'a> but has string -> Async<'a> instead
I googled the error but couldn't find anything that revealed my particular issue.
let postData = "userid=" + userId + "&password=" + password + "&source=" + sourceId + "&version=" + version
let url = postUrlBase + "100/LogIn?" + postData
Async.RunSynchronously (downloadPage(url, postData));;
Also, how would I modify the code so that it downloads a non-ending byte stream (but with occasional pauses between each burst of bytes) asynchronously instead of a string? How would I integrate reading this byte stream as it comes through? I realize this is more than one question, but since they are all closely related I figured one question would save some time.
Thanks in advance,
Bob
P.S. As I am still new to F# please feel free to make any alterations/suggestions to my code which shows how its done in a more functional style. I'm really trying to get out of my C# mindset, so I appreciate any pointers anyone may wish to share.
Edit: I accidentally pasted in the wrong snippet I was using. I did make an alteration to Tomas' snippet and forgot about it.
When I attempt to run your code, downloadPage(url, postData) doesn't work, as downloadPage expects two separate strings; downloadPage url postData is what is expected.
If you changed the let binding to tuple form, i.e. let downloadPage(url:string, postData:string), your call would have worked as well.
Explaining why you got the error you got is more complicated. Curried form creates a function that returns a function, or string -> string -> Async<string> in your case. The compiler therefore saw you passing a single parameter (tuples are single items, after all) and inferred that the result would have to be a string -> Async<string>, which is not compatible with Async<string>. Another error it could have found (and did in my case) is that string * string is not compatible with string, the exact error being: Expected string but found 'a * 'b.
This is what I had:
Async.RunSynchronously (downloadPage(url, postData));;
this is what worked after continued random guessing:
Async.RunSynchronously (downloadPage url postData);;
Although, I'm not sure why this change fixed the problem. Thoughts?
