I am trying to convert the following variable:
- final "in1.txt";
val it = [|[#"S",#".",#".",#"."],[#".",#".",#".",#"."],[#"W",#".",#"X",#"W"],
[#".",#".",#"X",#"E"]|] : char list array
from 'char list array' to 'char array array' in SML/NJ. The only reason I want to do this is that I need to be able to randomly iterate through this data to perform a Dijkstra-like algorithm for a school project (if there's a more efficient way to make this data iterable, I am all ears). Is there a way to do this? The function that reads the input file and returns the above is this (I found it on Stack Overflow):
fun linelist file =
    let
        open Char
        open String
        open List
        val instr = TextIO.openIn file
        val str = TextIO.inputAll instr
    in
        tokens isSpace str
        before
        TextIO.closeIn instr
    end

fun final file =
    let
        fun getsudo file = map explode (linelist file)
    in
        Array.fromList (getsudo file)
    end
and the input files that need to be processed are like the one that follows:
S...
....
W.XW
..XE
You might want to try a different way to read this (space delivery) map (to help Lakis -- yes I am a classmate of yours).
fun parse file =
    let
        fun next_String input = TextIO.inputAll input
        val stream = TextIO.openIn file
        val a = next_String stream
        val lista = explode a
    in
        lista
    end
parse is a function that reads all the contents of a text file and binds them to the string a. Then explode (from the String structure of the SML Basis library) turns that string into a list called lista, whose elements are the characters of a in the same order.
Then you can write another function that stores the contents of the list in an array, starting a new row whenever a #"\n" comes up.
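A minimal sketch of what that second function could look like (toGrid and rows are illustrative names, not part of the Basis; it splits lista on newline characters and builds a char array array, assuming Unix line endings so there are no #"\r" characters to strip):

(* toGrid : char list -> char array array *)
fun toGrid (lista : char list) : char array array =
    let
        (* accumulate the current row; start a new row at each #"\n" *)
        fun rows [] acc = [List.rev acc]
          | rows (#"\n" :: cs) acc = List.rev acc :: rows cs []
          | rows (c :: cs) acc = rows cs (c :: acc)
        (* drop the empty row produced by a trailing newline *)
        val rowLists = List.filter (not o List.null) (rows lista [])
    in
        Array.fromList (List.map Array.fromList rowLists)
    end

So toGrid (parse "in1.txt") should give you the char array array you are after, and you can then index into it with Array.sub twice (once per dimension).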
I would like to collect user input (a list of numbers separated by spaces), split it into an array, and convert the data in it from string to float.
Basically I want to recreate this Python code in Julia:
userlist = input('[+]type in a list of number separated by a space: ').split()
for i in range(len(userlist)): userlist[i] = float(userlist[i])
I tried this, but it didn't work:
print("type in a list of number separated by a space: ")
userinput = readline()
userlist = rsplit(userinput, " ")
for i in 0:length(userlist)
    userlist[i] = userlist[i]::Float64
end
You're close. You parse a String into a Float64 with parse, not with a type assertion:
userlist[i] = parse(Float64, userlist[i])
This still won't quite work, since userlist is an array of strings and can't store floats (Julia arrays have a fixed element type, for efficiency). Also note that Julia arrays are 1-indexed, so the loop range would need to be 1:length(userlist) or eachindex(userlist) rather than starting at 0. You could make a new array and then do the for loop like you have been, but you can also just use map.
userlist = map(x -> parse(Float64, x), userlist)
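Putting it together, a complete version of your snippet might look like this (using the broadcasting form parse. instead of map; split with no delimiter argument splits on any whitespace):

print("type in a list of numbers separated by a space: ")
userinput = readline()
# broadcast parse over every substring produced by split
userlist = parse.(Float64, split(userinput))
println(userlist)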
I would recommend using DelimitedFiles, as it is usually more robust (users always end up entering some malformed data, etc.):
using DelimitedFiles
readdlm(IOBuffer(readline()))
For example:
julia> readdlm(IOBuffer(readline()))
1 2 3
1×3 Matrix{Float64}:
1.0 2.0 3.0
Hi, newbie here, and I am trying to master recursive functions in Erlang. This function looks like it should work, but I cannot understand why it does not. I am trying to create a function that takes N and a string and prints the string to stdout N times.
My code:
-module(print_out_n_times).
-export([print_it/2]).

print_it(0, _) ->
    "";
print_it(N, string) ->
    io:fwrite(string),
    print_it(N - 1, string).
The error I get is:
** exception error: no function clause matching print_it(5, "hello")
How can I make this work?
Variables in Erlang start with a capital letter. string is an atom, not a variable named "string". When you define a function print_it(N, string), it can be called with any value for the first argument and only the atom string as the second. Your code should work if you replace string with String:
print_it(N, String) ->
    io:fwrite(String),
    print_it(N - 1, String).
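For reference, the whole module with that one change applied would look like this (behaviour otherwise unchanged; the 0 clause still returns ""):

-module(print_out_n_times).
-export([print_it/2]).

%% base case: nothing left to print
print_it(0, _) ->
    "";
%% print the string once, then recurse with N - 1
print_it(N, String) ->
    io:fwrite(String),
    print_it(N - 1, String).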
I'm trying to take a large file and split it into many smaller files. The location where each split occurs is based on a predicate returned from examining the contents of each given line (isNextObject function).
I have attempted to read in the large file via the File.ReadLines function so that I can iterate through the file one line at a time without having to hold the entire file in memory. My approach was to group the sequence into a sequence of smaller sub-sequences (one per file to be written out).
I found a useful function that Tomas Petricek created on fssnip called groupWhen. This function worked great for my initial testing on a small subset of the file, but a StackOverflowException is thrown when using the real file. I am not sure how to adjust the groupWhen function to prevent this (I'm still an F# greenie).
Here is a simplified version of the code showing only the relevant parts that will recreate the StackOverflowException:
// This is the function created by Tomas Petricek where the StackOverflowException occurs
module Seq =
    /// Iterates over elements of the input sequence and groups adjacent elements.
    /// A new group is started when the specified predicate holds about the element
    /// of the sequence (and at the beginning of the iteration).
    ///
    /// For example:
    ///    Seq.groupWhen isOdd [3;3;2;4;1;2] = seq [[3]; [3; 2; 4]; [1; 2]]
    let groupWhen f (input:seq<_>) = seq {
        use en = input.GetEnumerator()
        let running = ref true
        // Generate a group starting with the current element. Stops generating
        // when it finds an element such that 'f en.Current' is 'true'
        let rec group() =
            [ yield en.Current
              if en.MoveNext() then
                  if not (f en.Current) then yield! group() // *** Exception occurs here ***
              else running := false ]
        if en.MoveNext() then
            // While there are still elements, start a new group
            while running.Value do
                yield group() |> Seq.ofList }
This is the gist of the code making use of Tomas' function:
module Extractor =
    open System
    open System.IO
    open Microsoft.FSharp.Reflection

    // ... elided a few functions, including "isNextObject", which is
    // a string -> bool (examines the line and returns true
    // if the string meets the criteria that we are at the
    // start of the next inner file)

    let writeFile outputDir file =
        // ... write out "file" to the file system
        // NOTE: file is a seq<string>

    let writeFiles outputDir (files : seq<seq<_>>) =
        files
        |> Seq.iter (fun file -> writeFile outputDir file)
And here is the relevant code in the console application that makes use of the functions:
let lines = inputFile |> File.ReadLines
writeFiles outputDir (lines |> Seq.groupWhen isNextObject)
Any ideas on the proper way to stop groupWhen from blowing the stack? I'm not sure how I would convert the function to use an accumulator (or to use a continuation instead, which I think is the correct terminology).
The problem with this is that the group() function returns a list, which is an eagerly evaluated data structure. That means every time you call group() it has to run to the end, collect all results in a list, and return the list, so the recursive call happens within that same evaluation, i.e. truly recursively, creating stack pressure.
To mitigate this problem, you could just replace the list with a lazy sequence:
let rec group() = seq {
    yield en.Current
    if en.MoveNext() then
        if not (f en.Current) then yield! group()
    else running := false }
However, I would consider less drastic approaches. This example is a good illustration of why you should avoid doing recursion yourself and resort to ready-made folds instead.
For example, judging by your description, it seems that Seq.windowed may work for you.
It's easy to overuse sequences in F#, IMO. You can accidentally get stack overflows, plus they are slow.
So (not actually answering your question), personally I would just fold over the seq of lines using something like this:
let isNextObject line =
    line = "---"

type State = {
    fileIndex : int
    filename : string
    writer : System.IO.TextWriter
    }

let makeFilename index =
    sprintf "File%i" index

let closeFile (state:State) =
    //state.writer.Close() // would use this in real code
    state.writer.WriteLine("=== Closing {0} ===", state.filename)

let createFile index =
    let newFilename = makeFilename index
    let newWriter = System.Console.Out // dummy
    newWriter.WriteLine("=== Creating {0} ===", newFilename)
    // create new state with new writer
    {fileIndex = index + 1; writer = newWriter; filename = newFilename}

let writeLine (state:State) line =
    if isNextObject line then
        // finish old file here
        closeFile state
        // create new file here and return updated state
        createFile state.fileIndex
    else
        // write the line to the current file
        state.writer.WriteLine(line)
        // return the unchanged state
        state

let processLines (lines: string seq) =
    // setup
    let initialState = createFile 1
    // process the file
    let finalState = lines |> Seq.fold writeLine initialState
    // tidy up
    closeFile finalState
(Obviously a real version would use files rather than the console)
Yes, it is crude, but it is easy to reason about, with no unpleasant surprises.
Here's a test:
processLines [
"a"; "b"
"---";"c"; "d"
"---";"e"; "f"
]
And here's what the output looks like:
=== Creating File1 ===
a
b
=== Closing File1 ===
=== Creating File2 ===
c
d
=== Closing File2 ===
=== Creating File3 ===
e
f
=== Closing File3 ===
Is there a way, using the SML Basis library, to open a file at a specific position? That is, use an operating system call to change the position, rather than scan through the file and throw away the data.
This is tricky. Unfortunately, seeking isn't directly supported. Moreover, file positions are only transparent for binary files, i.e., those that you have opened with the BinIO structure [1]. For this structure, the corresponding type BinIO.StreamIO.pos is defined to be Position.int, which is some integer type.
However, in an SML system that supports the complete I/O stack from the standard you should be able to synthesise the following seek function using the lower I/O layers:
(* seekIn : BinIO.instream * Position.int -> unit *)
fun seekIn (instream, pos) =
    case BinIO.StreamIO.getReader (BinIO.getInstream instream) of
        (reader as BinPrimIO.RD {setPos = SOME f, ...}, _) =>
            ( f pos;
              BinIO.setInstream (instream,
                  BinIO.StreamIO.mkInstream (reader, Word8Vector.fromList []))
            )
      | (BinPrimIO.RD {name, ...}, _) =>
            raise IO.Io {
                name = name,
                function = "seekIn",
                cause = IO.RandomAccessNotSupported
            }
Use it like:
val file = BinIO.openIn "filename"
val _ = seekIn(file, 200)
val bin = BinIO.inputN(file, 1000)
If you need to convert from Word8Vector to string:
val s = Byte.bytesToString bin
You can do the equivalent for out streams as well.
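For instance, here is a sketch of the analogous seekOut under the same assumption (that the underlying writer provides setPos); note that getWriter flushes and terminates the old stream, which is why a fresh one is installed with mkOutstream and setOutstream:

(* seekOut : BinIO.outstream * Position.int -> unit *)
fun seekOut (outstream, pos) =
    case BinIO.StreamIO.getWriter (BinIO.getOutstream outstream) of
        (writer as BinPrimIO.WR {setPos = SOME f, ...}, mode) =>
            ( f pos;
              BinIO.setOutstream (outstream,
                  BinIO.StreamIO.mkOutstream (writer, mode))
            )
      | (BinPrimIO.WR {name, ...}, _) =>
            raise IO.Io {
                name = name,
                function = "seekOut",
                cause = IO.RandomAccessNotSupported
            }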
[1] http://standardml.org/Basis/bin-io.html#BIN_IO:SIG:SPEC
If you can manage to get hold of the reader/writer, then they should have getPos, setPos and endPos functions, depending on which kind of reader/writer you are dealing with.
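As a sketch of what "getting hold of the reader" looks like for a BinIO instream (the pattern binding is deliberately non-exhaustive, and getReader terminates the stream, so real code would install a fresh stream afterwards, as the seekIn function above does):

val ins = BinIO.openIn "filename"
(* getPos, setPos and endPos are options: present only when the
   underlying file actually supports random access *)
val (BinPrimIO.RD {getPos, setPos, endPos, ...}, _) =
    BinIO.StreamIO.getReader (BinIO.getInstream ins)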
I'm trying to read text from a file in SML. Eventually, I want a list of individual words; however, I'm struggling with how to convert a TextIO.elem to a string. For example, if I write the following code, it returns a TextIO.elem, but I don't know how to convert it to a string so that I can concatenate it with another string:
TextIO.input1 inStream
TextIO.elem is just a synonym for char, so you can use the str function to convert it to a string. But as I replied elsewhere, I suggest using TextIO.inputAll to get a string right away.
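A small sketch of the first option (note that TextIO.input1 actually gives you a char option, so it has to be unwrapped; inStream is assumed to be an already-opened instream):

val c  = valOf (TextIO.input1 inStream)  (* c : char; valOf raises Option at end of stream *)
val s  = str c                           (* s : string of length 1 *)
val s' = "prefix" ^ s                    (* concatenate with another string *)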
Here is a function that takes an instream and delivers all (remaining) words in it:
val words = String.tokens Char.isSpace o TextIO.inputAll
The type of this function is TextIO.instream -> string list.
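For example, with the in1.txt from the first question above (closing the stream is left out here for brevity; real code should call TextIO.closeIn):

val instream = TextIO.openIn "in1.txt"
val ws = words instream   (* ["S...", "....", "W.XW", "..XE"] *)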