RSA procedure cryptology - encryption

Hi, I was wondering if anyone can help me with the following procedure using Maple.
The protocol of the RSA encryption/decryption method is below, followed by the question I am trying to attempt, and then my attempt at it. I will appreciate any help. Thanks.
My current attempt is the following:
rsa := proc (key::rsakey, msg::(list(rsascii)))
local ct, pe, pm, i;
pm := 26271227347;
pe := key[2];
ct := [];
for i to nops(msg) do ct := [op(ct), `mod`(message[i]^pe, pm)]
end do;
RETURN(ct)
end proc;
This was put together using the Maple website.

You've been given n and e, and you'll need to find the corresponding d before you can decode. I don't understand why you were trying to use e to decode.
Your procedure contains at least one typo: using message[i] instead of msg[i]. See also the comments in the code below.
After finding d I get that the encoded integer 11393739244 decodes to the integer 87, which corresponds to the ASCII character "W" (not "wha" or "Wha" as you suggested).
I don't understand what you intend to do about block size, so I've had to guess. Below I show encoding/decoding done either A) a character at a time, or B) using three characters at once. I trust you realize that encoding one character at a time isn't a great idea. Also, in a duplicate post in another forum you wrote that you don't care about security against attack. (You also wrote there that this isn't homework, but here it looks more like it, IMO.)
If you had trouble writing and using your rsa procedure then you may find the various splitting/concatenating/padding operations tough as well.
You wrote in a comment that when you tried to use your initial attempt at the rsa procedure, "it doesn't give anything back". If it returned as an unevaluated call then perhaps your attempt at creating the proc and assigning it didn't actually work. If you have trouble using Maple's default 2D Input mode in a Document then consider switching your preferences to 1D Maple Notation input in a Worksheet. Those are two Preferences for Maple's Standard Java GUI.
NB. I use Maple's numtheory[lambda] command to find "the smallest integer i such that for all g coprime to n, g^i is congruent to 1 modulo n". In recent Maple versions this is also available as the command NumberTheory:-CarmichaelLambda. See also here.
restart;
# The procedure `rsa` below can be used to both encode or
# decode an integer.
#
# Conversion from/to ASCII is done separately, before/after.
rsa := proc(key::list(posint), msg::list(posint))
local ct, pe, pm, i;
pm := key[1];
pe := key[2];
## The original used `message` instead of `msg`, which was
## a careless typo. But iterated list concatenation like this
## is inefficient. Better to just use `seq`, as below.
## Also, use inert `&^` instead of `^` in the call to `mod`
## since the latter inefficiently computes the power
## explicitly (before taking the modulus).
#ct := [];
# for i to nops(msg) do ct := [op(ct), `mod`(msg[i] &^ pe, pm)]
#end do;
ct := map(u->`mod`(u &^ pe, pm), msg);
return ct;
end proc:
# You supplied (n,e) and you'll need to find d in order to decode.
n:=26271227347;
n := 26271227347
L := numtheory[lambda](n);
L := 13135445468
e:=11546465;
e := 11546465
evalb( e < L ); # a requirement
true
evalb( gcd(e, L) = 1); # a requirement
true
d := 1/e mod L;
d := 7567915453
# Now decode the number you supplied.
res := rsa([n,d],[11393739244]);
res := [87]
with(StringTools):
# So what ASCII character is that?
convert(res,bytes);
"W"
s := "Wha":
sb := map(convert,convert(s,bytes),string);
sb := ["87", "104", "97"]
sbn := map(parse,sb);
sbn := [87, 104, 97]
encoded := rsa([n,e],sbn);
encoded := [11393739244, 9911682959, 21087186892]
decoded := rsa([n,d],encoded);
decoded := [87, 104, 97]
pad := proc(str::string)
local r;
r := irem(length(str),3);
cat(seq("0",i=1..`if`(r=0,0,3-r)), str);
end proc:
map(pad, map(convert,decoded,string));
["087", "104", "097"]
cat(op(map(u->convert(map(parse,[LengthSplit(convert(u,string),3)]),
bytes), %)));
"Wha"
newsb := [cat(op(map(SubstituteAll,map(PadLeft,sb,3)," ","0")))];
newsb := ["087104097"]
newsbn := map(parse,newsb);
newsbn := [87104097]
encoded := rsa([n,e],newsbn);
encoded := [15987098394]
decoded := rsa([n,d],%);
decoded := [87104097]
map(pad, map(convert,decoded,string));
["087104097"]
cat(op(map(u->convert(map(parse,[LengthSplit(convert(u,string),3)]),
bytes), %)));
"Wha"

Related

How to count number of occurrences of a character in an array in Pascal

I have to write a Pascal program that counts how often a character appears in the input and displays that count in the output.
Input P2 changes:
Second Attempt at the coding phase
I tried revising the code. I added writeln('input array of characters'); and writeln('Number of Occurrences',k);, which should let me output how many times the character appears in S overall, and I used for and if statements so the final count is produced based on the conditions (when a match is found, increment the counter). I am still getting errors; take a look at Input P2 and Output P2.
Input P1
function Count(t, s: String): Integer;
var
Offset, P: Integer;
begin
Result := 0;
Offset := 1;
P := PosEx(t, s, Offset);
while P > 0 do
begin
Inc(Result);
P := PosEx(t, s, P + 1);
end;
end;
Output P1
Target OS: Linux for x86-64
Compiling main.pas
main.pas(5,3) Error: Identifier not found "Result"
main.pas(7,8) Error: Identifier not found "PosEx"
main.pas(8,3) Error: Identifier not found "unsigned"
main.pas(8,12) Fatal: Syntax error, ";" expected but "identifier N" found
Fatal: Compilation aborted
Error: /usr/bin/ppcx64 returned an error exitcode
-------------------------------------------------------------------
Input P2
program p1
var S:string
i:integer
begin
writeln('input array of characters');
k:=O;
for i:=1 to length (S) do
if (S[i])='m') and (S[i+1]='a') then k:=k+1;
writeln('Number of Occurrences',k);
Readln;
end.
Output P2
Compiling main.pas
main.pas(2,1) Fatal: Syntax error, ";" expected but "VAR" found
Fatal: Compilation aborted
Error: /usr/bin/ppcx64 returned an error exitcode
The errors you see in the first block:
Identifier not found "Result"
Standard Pascal doesn't recognize the pseudovariable Result. In some Pascal implementations (like e.g. Delphi) it can be used to assign a value to the function result. The Pascal you are using needs to have the result of a function assigned to the name of the function. For example:
function Whatever(): integer;
begin
Whatever := 234;
end;
Identifier not found "PosEx"
Not all Pascal implementations include the PosEx() function. You need to use Pos() instead. But the standard implementation of Pos() doesn't take the "search start position" that PosEx has. Therefore you need to ditch Pos() and do as you do in "Input P2", that is, traverse the text character by character and count the occurrences as you go.
Identifier not found "unsigned"
Seems you have removed that unknown identifier.
The error you see in the second block:
In Output P2 the error message should be clear. You are missing a semicolon where one is needed. Actually you are missing three of them.
You are also missing the line that reads user input: ReadLn(S);.
Finally, to count both upper and lower case characters you can use an extra string variable, say SU: string, to which you assign SU := UpperCase(S) after reading user input, and then use that string to count the occurrences.
I think this is more like what you want to do:
function Count(t, s: String): Integer;
var
  Offset, Res, P: Integer;
begin
  Res := 0;
  Offset := 1;
  repeat
    P := Pos(t, s, Offset);
    if P > 0 then
      Inc(Res);
    Offset := P + 1;
  until P = 0;
  Count := Res;
end;
Now, if your Pos doesn't accept a start position, you can implement one yourself:
Function Pos(const t, s: string; const Start: integer): Integer;
Var
  LS, LT,      {lengths}
  IxS, IxT,    {indexes}
  R: Integer;  {result}
begin
  R := 0;
  {if your compiler lacks Length, use LS := Ord(s[0]); LT := Ord(t[0]); instead}
  LS := Length(s);
  LT := Length(t);
  if (LT <= LS)                  {if the target is larger than the search string, it's not there}
     and (Start >= 1) and (Start + LT - 1 <= LS) then {the search must also fit inside s}
  begin                          {otherwise, start the search}
    IxS := Start;
    while (R = 0) and (IxS + LT - 1 <= LS) do
    begin
      IxT := 1;                  {compare t against s starting at position IxS}
      while (IxT <= LT) and (s[IxS + IxT - 1] = t[IxT]) do
        Inc(IxT);
      if IxT > LT then
        R := IxS                 {full match found; remember the position}
      else
        Inc(IxS);                {no match here, try the next position}
    end;
  end;
  Pos := R;
end;

go's big.Int underlying value mutates when value points to another instance

I have come across a bit of unexplained behavior of Go's big.Int when pointing an instance of one big.Int to another.
I am aware that in order to set the value of one big.Int instance to another, one must use the Int.Set/SetXXX setters, because they actually cause the underlying abs slice in big.Int to be copied into the receiver's own storage. However, putting that aside for a moment, I'd like to know why the following behavior occurs.
Consider the following:
Wrong example (underlying value mutates):
func main() {
v1p := big.NewInt(1)
v2p := big.NewInt(2)
v1 := *v1p
v2 := *v2p
v2 = v1
v1.SetInt64(3)
fmt.Println(v1.Int64(), v2.Int64())
}
(run here: https://play.golang.org/p/WxAbmGdKG9b)
Correct example (value does not mutate):
func main() {
v1p := big.NewInt(1)
v2p := big.NewInt(2)
v1 := *v1p
v2 := *v2p
v2.Set(v1p)
v1.SetInt64(3)
fmt.Println(v1.Int64(), v2.Int64())
}
(run here: https://play.golang.org/p/16qsGhwHIWf)
If I understand correctly, the following should essentially demonstrate what happens in the wrong example:
func main() {
var a, b *int // analogous to the 2 big.Int pointers returned
c, d := 3, 3
a = &c // we set the pointers to point to something we can then dereference
b = &d
e := *a // e and f should now point to the values pointed to by the pointers
f := *b
// the rest is self-explanatory
e = f
c = 5
d = 4
fmt.Println(a, b, c, d, e, f)
}
(run here: https://play.golang.org/p/cx76bnmJhG7)
My only assumption is that somehow when copying the struct content onto v2 in the Wrong example, what happens is that abs slice does not get deep-copied but that the storage that it references is actually the same storage that the slice in v1 points to.
Is this really what happens? Is this the expected behavior according to the language spec too?
As pointed out by icza and Volker: dereferencing the big.Int pointer copies the struct, including the abs slice header, which still points to the same underlying array. The result is that the same backing array ends up referenced from multiple slices, so mutating the value through one of them is visible through the other.
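The effect is easier to see without big.Int at all. Here is a minimal sketch with a hypothetical wrapper type standing in for big.Int: copying the struct copies the slice header, not the array it points to.

package main

import "fmt"

// wrapper is a stand-in for big.Int; abs plays the role of its internal slice.
type wrapper struct {
    abs []int
}

func main() {
    v1 := wrapper{abs: []int{1}}
    v2 := wrapper{abs: []int{2}}

    v2 = v1                // copies the struct; both abs headers now share v1's backing array
    v1.abs[0] = 3          // mutate through v1 ...
    fmt.Println(v2.abs[0]) // ... and v2 observes 3 as well
}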

In Lua, how to insert numbers as 32 bits to front of a binary sequence?

I'm new to Lua; I started using it with OpenResty. I want to output an image and its x,y coordinates together to clients as one binary sequence that looks like: x_int32_bits y_int32_bits image_raw_data. At the client, I know the first 32 bits are x, the second 32 bits are y, and the rest is the image raw data. I have some questions:
How to convert number to 32 binary bits in Lua?
How to merge two 32 bits to one 64 bits sequence?
How to insert 64 bits to front of image raw data? And how to be fastest?
file:read("*a") got string type result, is the result ASCII sequence or like "000001110000001..." string?
What I'm thinking is like below, I don't know how to convert 32bits to string format same as file:read("*a") result.
#EgorSkriptunoff thank you, you opened a window for me. I wrote some new code, would you take a look, and I have another question, is the string merge method .. inefficient and expensive? Specially when one of the string is very large. Is there an alternative way to merge the bytes string?
NEW CODE UNDER #EgorSkriptunoff 's GUIDANCE
local ffi = require("ffi")  -- LuaJIT FFI, available in OpenResty

function _M.number_to_int32_bytes(num)
  return ffi.string(ffi.new("int32_t[1]", num), 4)
end
local x, y = unpack(v)
local file, err = io.open(image_path, "rb")
if nil ~= file then
local image_raw_data = file:read("*a")
if nil == image_raw_data then
ngx.log(ngx.ERR, "read file error:", err)
else
-- Is the .. method inefficient and expensive? Because the image raw data maybe large,
-- so will .. copy all the data to a new string? Is there an alternative way to merge the bytes string?
output = utils.number_to_int32_bytes(x) .. utils.number_to_int32_bytes(y) .. image_raw_data
ngx.print(output)
ngx.flush(true)
end
file:close()
end
OLD CODE:
function table_merge(t1, t2)
for k,v in ipairs(t2) do
table.insert(t1, v)
end
return t1
end
function numToBits(num, bits)
-- returns a table of bits
local t={} -- will contain the bits
for b=bits,1,-1 do
rest=math.fmod(num,2)
t[b]=rest
num=(num-rest)/2
end
if num==0 then return t else return {'Not enough bits to represent this number'} end
end
-- Need to insert x,y as 32bits respectively to front of image binary sequence
function output()
local x, y = 1, 3
local file, err = io.open("/storage/images/1.png", "rb")
if nil ~= file then
local d = file:read("*a") ------- type(d) is string, why?
if nil == d then
ngx.log(ngx.ERR, "read file error:", err)
else
-- WHAT WAY I'M THINKING -----------------
-- Convert x, y to binary table, then merge them to one binary table
data = table_merge(numToBits(x, 32), numToBits(y, 32))
-- Convert data from binary table to string
data = convert_binary_table_to_string(data) -- HOW TO DO THAT? --------
-- Insert x,y data to front of image data, is data .. d ineffective?
data = data .. d
-------------------------------------------
ngx.print(data)
ngx.flush(true)
end
file:close()
end
end
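Not an answer to the Lua-specific parts, but purely as an illustration of the intended byte layout (4 bytes for x, 4 bytes for y, then the raw data), here is a small sketch in Go using encoding/binary. The function name and the choice of little-endian are assumptions; use whatever byte order your client expects (ffi.new("int32_t[1]", num) gives the machine's native order, little-endian on x86).

package main

import (
    "bytes"
    "encoding/binary"
    "fmt"
)

// frame prepends x and y as 32-bit little-endian integers to the raw image data.
func frame(x, y int32, raw []byte) []byte {
    buf := new(bytes.Buffer)
    binary.Write(buf, binary.LittleEndian, x)
    binary.Write(buf, binary.LittleEndian, y)
    buf.Write(raw)
    return buf.Bytes()
}

func main() {
    out := frame(1, 3, []byte("...image bytes..."))
    fmt.Println(len(out), out[:8]) // the first 8 bytes carry x and y
}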

VHDL - How to efficiently convert integer to ascii or 8-bit slv

I'm trying to output different (non-constant) values over serial. Serial communication is working fine but there doesn't seem to be an elegant, synthesizable way to convert any integer/natural/std_logic_vector/unsigned/signed type of any size and value to an array of 8-bit std_logic_vectors based on the ASCII table. That is super weird because what I'm trying to do is not uncommon.
One way I can do this is with big lookup tables or long, nested chains of if-elsif-else statements but that seems very inefficient and inelegant.
This doesn't synthesize:
eight_bit_result <= std_logic_vector(to_unsigned(character'pos(some_integer), 8));
ISE 14.7 breaks with a vague error caused by some header file. Even if it did work, it wouldn't work for values outside of 0-255. What is the right way to do this?
EDIT:
I wrote a quick and dirty function to cover integer values 0-9999. It needs no clocked process, no extra entity, etc. The proposed answers so far seem overly complicated.
function get_ascii_array_from_int(i : integer range 0 to 9999) return char_array is
variable result : char_array(0 to 3) := (x"30", x"30", x"30", x"30"); -- 0000
begin
if i >= 0 then
if i < 1000 then
result(0) := x"30"; -- 0
result(1 to 3) := get_ascii_array_from_int_hundreds(i);
elsif i < 2000 then
result(0) := x"31"; -- 1
result(1 to 3) := get_ascii_array_from_int_hundreds(i-1000);
elsif i < 3000 then
result(0) := x"32"; -- 2
result(1 to 3) := get_ascii_array_from_int_hundreds(i-2000);
elsif i < 4000 then
result(0) := x"33"; -- 3
result(1 to 3) := get_ascii_array_from_int_hundreds(i-3000);
elsif i < 5000 then
result(0) := x"34"; -- 4
result(1 to 3) := get_ascii_array_from_int_hundreds(i-4000);
elsif i < 6000 then
result(0) := x"35"; -- 5
result(1 to 3) := get_ascii_array_from_int_hundreds(i-5000);
elsif i < 7000 then
result(0) := x"36"; -- 6
result(1 to 3) := get_ascii_array_from_int_hundreds(i-6000);
elsif i < 8000 then
result(0) := x"37"; -- 7
result(1 to 3) := get_ascii_array_from_int_hundreds(i-7000);
elsif i < 9000 then
result(0) := x"38"; -- 8
result(1 to 3) := get_ascii_array_from_int_hundreds(i-8000);
else
result(0) := x"39"; -- 9
result(1 to 3) := get_ascii_array_from_int_hundreds(i-9000);
end if;
else
result := (x"6e", x"65", x"67", x"23"); -- "neg#"
end if;
return result;
end get_ascii_array_from_int;
As you may have guessed there are three other methods: get_ascii_array_from_int_hundreds
get_ascii_array_from_int_tens
get_ascii_array_from_int_ones
Say what you will about this function, but keep in mind that it produces the correct result when I need it, not several cycles later, and it is simple and straightforward. I can easily expand it to cover negative numbers and larger numbers.
Is it an absolute requirement to output in decimal?
Hexadecimal output is the way to go if you want efficiency.
To create hexadecimal output, you just split the integer up into groups of 4 bits and convert each group to the ASCII characters 0-F.
Converting an integer to a decimal ASCII representation is a computationally heavy operation, as it requires many divisions.
But if this is what you need, one of the more efficient way to do it would be to create a state-machine which calculates one digit each clock cycle. The state machine needs to perform the following operations each clock cycle:
Divide integer X by 10 and store result back in X.
Calculate remainder.
Produce ASCII digit by adding x'30' to the remainder.
The state machine would have to run for 10 clock cycles to convert a 32-bit integer to ASCII decimal.
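For reference, the arithmetic that state machine performs, one digit per clock cycle, looks like this in software. This is a rough Go sketch, just to illustrate the divide/remainder/add-0x30 steps, not VHDL:

package main

import "fmt"

// toDecimalASCII performs the same steps as the state machine described above:
// each iteration divides by 10, takes the remainder, and adds 0x30 to get an ASCII digit.
func toDecimalASCII(x uint32) []byte {
    digits := make([]byte, 0, 10)
    for i := 0; i < 10; i++ { // ten iterations cover any 32-bit value
        digits = append([]byte{byte(x%10) + 0x30}, digits...) // prepend the next digit
        x /= 10
    }
    return digits
}

func main() {
    fmt.Println(string(toDecimalASCII(1975))) // "0000001975"
}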
You need to design the small circuit described in this paper and use it in an integer-to-BCD converter; see the example for values 0 to 9999 on page 2. This will suit your needs.
M. Benedek, "Developing Large Binary to BCD Conversion Structures", Proceedings of the 3rd IEEE Symposium on Computer Arithmetic, Southern Methodist University, Dallas, Texas, November 19-20, 1975
http://www.acsel-lab.com/arithmetic/arith3/papers/ARITH3_Benedek.pdf

Set slice index using reflect in Go

I'm in Go, working with a reflect.Value representation of a slice. I have the following:
slice := reflect.MakeSlice(typ, len, cap)
If I want to get the ith value from slice, it's simple:
v := slice.Index(i) // returns a reflect.Value
However, I can't seem to find a way to set the ith value. reflect.Value has lots of setter methods, for example, if I had a map, m, the following is possible:
m.SetMapIndex(key, value) // key and value have type reflect.Value
But there doesn't seem to be an equivalent for slices. My one thought was that maybe the value returned from slice.Index(i) is actually a pointer somehow, so calling v := slice.Index(i); v.Set(newV) would work? I'm not sure. Ideas?
Figured it out! Turns out I posted this prematurely - my guess that slice.Index(0) refers directly to the underlying element (so it can be set in place) was correct. In particular:
one := reflect.ValueOf(int(1))
slice := reflect.MakeSlice(reflect.TypeOf([]int{}), 1, 1)
v := slice.Index(0)
fmt.Println(v.Interface())
v.Set(one)
fmt.Println(v.Interface())
v = slice.Index(0)
fmt.Println(v.Interface())
prints:
0
1
1
(Here's runnable code on the go playground)
This might help:
n := val.Len()
if n >= val.Cap() {
    ncap := 2 * n
    if ncap < 4 {
        ncap = 4
    }
    nval := reflect.MakeSlice(val.Type(), n, ncap)
    reflect.Copy(nval, val)
    val.Set(nval)
}
val.SetLen(n + 1)
// ...
val.Index(n).SetString("value") // Depends on type
Taken from a library I wrote a while back github.com/webconnex/xmlutil, specifically decode.go.
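Since that snippet is lifted out of its surrounding code, here is a self-contained sketch of the same grow-and-set pattern; the variable names and the string element type are just for illustration:

package main

import (
    "fmt"
    "reflect"
)

func main() {
    // An addressable slice value, like you would get from a pointer passed into a decoder.
    val := reflect.ValueOf(&[]string{}).Elem()

    for _, s := range []string{"a", "b", "c", "d", "e"} {
        n := val.Len()
        if n >= val.Cap() { // grow the backing array when it is full
            ncap := 2 * n
            if ncap < 4 {
                ncap = 4
            }
            nval := reflect.MakeSlice(val.Type(), n, ncap)
            reflect.Copy(nval, val)
            val.Set(nval)
        }
        val.SetLen(n + 1)
        val.Index(n).SetString(s) // set the newly appended element
    }

    fmt.Println(val.Interface()) // [a b c d e]
}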
