Unicode map from Font Awesome 4 to Font Awesome 5 - css

I have several css files that contain hard-coded FA 4 Unicode values. Does there exist a map from FA 4 Unicode to FA 5 Unicode?

To create the map, I had to use three different tables:
https://fontawesome.com/v4.7.0/cheatsheet/
https://fontawesome.com/cheatsheet
https://fontawesome.com/how-to-use/on-the-web/setup/upgrading-from-version-4
With the help of some MySQL queries, I created the following map, which relates the Font Awesome 4 name and Unicode value to the respective Font Awesome 5 name and Unicode value, but only where the Unicode values differ:
"fa4name","fa5name","fa4code","fa5code"
"fa-address-book-o","address-book","f2ba","f2b9"
"fa-address-card-o","address-card","f2bc","f2bb"
"fa-arrow-circle-o-down","arrow-alt-circle-down","f01a","f358"
"fa-arrow-circle-o-left","arrow-alt-circle-left","f190","f359"
"fa-arrow-circle-o-right","arrow-alt-circle-right","f18e","f35a"
"fa-arrow-circle-o-up","arrow-alt-circle-up","f01b","f35b"
"fa-arrows","arrows-alt","f047","f0b2"
"fa-arrows-alt","expand-arrows-alt","f0b2","f31e"
"fa-arrows-h","arrows-alt-h","f07e","f337"
"fa-arrows-v","arrows-alt-v","f07d","f338"
"fa-bell-o","bell","f0a2","f0f3"
"fa-bell-slash-o","bell-slash","f1f7","f1f6"
"fa-bookmark-o","bookmark","f097","f02e"
"fa-building-o","building","f0f7","f1ad"
"fa-check-circle-o","check-circle","f05d","f058"
"fa-check-square-o","check-square","f046","f14a"
"fa-circle-o","circle","f10c","f111"
"fa-circle-thin","circle","f1db","f111"
"fa-clipboard","clipboard","f0ea","f328"
"fa-cloud-download","cloud-download-alt","f0ed","f381"
"fa-cloud-upload","cloud-upload-alt","f0ee","f382"
"fa-comment-o","comment","f0e5","f075"
"fa-commenting","comment-dots","f27a","f4ad"
"fa-commenting-o","comment-dots","f27b","f4ad"
"fa-comments-o","comments","f0e6","f086"
"fa-credit-card-alt","credit-card","f283","f09d"
"fa-cutlery","utensils","f0f5","f2e7"
"fa-diamond","gem","f219","f3a5"
"fa-envelope-o","envelope","f003","f0e0"
"fa-envelope-open-o","envelope-open","f2b7","f2b6"
"fa-exchange","exchange-alt","f0ec","f362"
"fa-external-link","external-link-alt","f08e","f35d"
"fa-external-link-square","external-link-square-alt","f14c","f360"
"fa-folder-o","folder","f114","f07b"
"fa-folder-open-o","folder-open","f115","f07c"
"fa-heart-o","heart","f08a","f004"
"fa-hourglass-o","hourglass","f250","f254"
"fa-id-card-o","id-card","f2c3","f2c2"
"fa-level-down","level-down-alt","f149","f3be"
"fa-level-up","level-up-alt","f148","f3bf"
"fa-long-arrow-down","long-arrow-alt-down","f175","f309"
"fa-long-arrow-left","long-arrow-alt-left","f177","f30a"
"fa-long-arrow-right","long-arrow-alt-right","f178","f30b"
"fa-long-arrow-up","long-arrow-alt-up","f176","f30c"
"fa-map-marker","map-marker-alt","f041","f3c5"
"fa-map-o","map","f278","f279"
"fa-minus-square-o","minus-square","f147","f146"
"fa-mobile","mobile-alt","f10b","f3cd"
"fa-money","money-bill-alt","f0d6","f3d1"
"fa-paper-plane-o","paper-plane","f1d9","f1d8"
"fa-pause-circle-o","pause-circle","f28c","f28b"
"fa-pencil","pencil-alt","f040","f303"
"fa-play-circle-o","play-circle","f01d","f144"
"fa-plus-square-o","plus-square","f196","f0fe"
"fa-question-circle-o","question-circle","f29c","f059"
"fa-share-square-o","share-square","f045","f14d"
"fa-shield","shield-alt","f132","f3ed"
"fa-sign-in","sign-in-alt","f090","f2f6"
"fa-sign-out","sign-out-alt","f08b","f2f5"
"fa-spoon","utensil-spoon","f1b1","f2e5"
"fa-square-o","square","f096","f0c8"
"fa-star-half-o","star-half","f123","f089"
"fa-star-o","star","f006","f005"
"fa-sticky-note-o","sticky-note","f24a","f249"
"fa-stop-circle-o","stop-circle","f28e","f28d"
"fa-tablet","tablet-alt","f10a","f3fa"
"fa-tachometer","tachometer-alt","f0e4","f3fd"
"fa-thumbs-o-down","thumbs-down","f088","f165"
"fa-thumbs-o-up","thumbs-up","f087","f164"
"fa-ticket","ticket-alt","f145","f3ff"
"fa-times-circle-o","times-circle","f05c","f057"
"fa-trash","trash-alt","f1f8","f2ed"
"fa-trash-o","trash-alt","f014","f2ed"
"fa-user-circle-o","user-circle","f2be","f2bd"
"fa-user-o","user","f2c0","f007"
"fa-window-close-o","window-close","f2d4","f410"
"fa-calendar","calendar","f073","f133"
"fa-reply","reply","f112","f3e5"
"fa-window-close","window-close","f2d3","f410"
The first row contains the column headers.
Note that the upgrading-from-version-4 table only lists changes in the icon names; it was also necessary to join on the two cheatsheets listed above to catch icons whose names stayed the same but whose Unicode values differ.
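To apply the map to the hard-coded values, a short script can rewrite the \fXXX escapes in each CSS file. A minimal sketch (a few rows of the map are inlined here for illustration; in practice, load the full CSV above):

```python
import re

# A few fa4code -> fa5code pairs from the map above; in practice,
# load the full CSV instead of inlining rows.
FA4_TO_FA5 = {
    'f2ba': 'f2b9',  # fa-address-book-o -> address-book
    'f01a': 'f358',  # fa-arrow-circle-o-down -> arrow-alt-circle-down
    'f097': 'f02e',  # fa-bookmark-o -> bookmark
}

def upgrade_css(css):
    # Rewrite \fXXX escapes (e.g. in content: rules) from FA4 to FA5
    # codepoints; codes not present in the map are left untouched.
    return re.sub(r'\\(f[0-9a-f]{3})',
                  lambda m: '\\' + FA4_TO_FA5.get(m.group(1), m.group(1)),
                  css)

print(upgrade_css('.download:before { content: "\\f01a"; }'))
# .download:before { content: "\f358"; }
```

Note that beyond the codepoint change, many FA4 outline ('-o') icons moved to the Regular style in FA5, so the font-family and font-weight declarations may need updating as well.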


Error in R - more columns than column names

I am trying to read in a file that has 5 column headers; however, column 5 contains a list of genes separated by commas.
EC2 <- read.table("david.txt", header=TRUE)
Whenever I run the code above, I get the message
"more columns than column names."
I feel like the answer is probably simple. Any ideas?
These are the first 3 lines:
Category ID Term PValue Genes
BP GO:0006412 translation 2.711930356491234E-10 P0A7U3, P0A7K6, P68191, P0A7Q1, P0A7U7, P02359, P02358, P60438, P0A7L0, P0A7L3, P0A7L8, P0A7T3, P0A8A8, P69441, P0A8N5, P0A8N3, P02413, P0A7T7, P0AG63, P0A7D1, P0AA10, P0ADY3, P0AG67, P0A7M2, P0A898, P0A9W3, P0A7M6, P0A7X3, P0AAR3, P0A7S3, P0A7S9, P0ADY7, P62399, P60624, P32132, P0ADZ4, P60723, P0C0U4, P0AG51, P0ADZ0, P0A7N9, P0A7J3, P0A7W7, P0AG59, P68679, P0C018, P0A7R1, P0A7N4, P0A7R5, P0A7R9, P0AG44, P68919, P61175, P0A6K3, P0A7V0, P0A7M9, P0A7K2, P0A7V3, P0AG48
BP GO:0051301 cell division 1.4011247561051483E-7 P0AC30, P17952, P75949, P0A6H1, P06966, P0A9R7, P64612, P36548, P60472, P45955, P0A855, P06136, P0A850, P22523, P08373, P11880, P0AFB1, P60293, P18196, P0ABG4, P07026, P0A749, P29131, P0A6S5, P26648, P17443, P0ADS2, P0A8P6, P0A8P8, P0A6A7, P46889, P0A6F9, P0AE60, P0AD68, P19934, P0ABU9, P37773

Issues with importing R Data due to formatting

I'm trying to import txt data into R; however, due to the txt file's unusual formatting, I'm unsure how to do this. I suspect the issue is that the file was formatted to line columns up under their column names, and since it's a text file, this was done with varying numbers of spaces. For example:
Gene Chromosomal Swiss-Prot MIM Description
name position AC Entry name code
______________ _______________ ______________________ ______ ______________________
A3GALT2 1p35.1 U3KPV4 A3LT2_HUMAN Alpha-1,3-galactosyltransferase 2 (EC 2.4.1.87) (Isoglobotriaosylceramide synthase) (iGb3 synthase) (iGb3S) [A3GALT2P] [IGBS3S]
AADACL3 1p36.21 Q5VUY0 ADCL3_HUMAN Arylacetamide deacetylase-like 3 (EC 3.1.1.-)
AADACL4 1p36.21 Q5VUY2 ADCL4_HUMAN Arylacetamide deacetylase-like 4 (EC 3.1.1.-)
ABCA4 1p21-p22.1 P78363 ABCA4_HUMAN 601691 Retinal-specific phospholipid-transporting ATPase ABCA4 (EC 7.6.2.1) (ATP-binding cassette sub-family A member 4) (RIM ABC transporter) (RIM protein) (RmP) (Retinal-specific ATP-binding cassette transporter) (Stargardt disease protein) [ABCR]
ABCB10 1q42 Q9NRK6 ABCBA_HUMAN 605454 ATP-binding cassette sub-family B member 10, mitochondrial precursor (ATP-binding cassette transporter
Because of this, I have not been able to import my data at all. Since the text was justified with spaces rather than a fixed delimiter, the number of spaces between fields isn't uniform.
This is the link to the data sheet that I am using: https://www.uniprot.org/docs/humchr01.txt
Each field has a fixed width. Therefore, you can use the function read.fwf to read the file.
The following code reads the input file (assuming the file has only the rows, without the headers)
f <- read.fwf('input.txt', widths=c(14, 16, 11, 12, 7, 250), strip.white=TRUE)
colnames(f) <- c('Gene name', 'Chromosomal position', 'Swiss-Prot AC',
                 'Swiss-Prot Entry name', 'MIM code', 'Description')

How to access a particular sub-set of data in R Table

I have tabular (long-format) data with a number of variables. I want to load the CSV once and then access particular subsets of it later on. For example:
Blog,Region,Dim1
Individual,PK,-4.75
Individual,PK,-5.69
Individual,PK,-0.27
Individual,PK,-2.76
Individual,PK,-8.24
Individual,PK,-12.51
Individual,PK,-1.28
Individual,PK,0.95
Individual,PK,-5.96
Individual,PK,-8.81
Individual,PK,-8.46
Individual,PK,-6.15
Individual,PK,-13.98
Individual,PK,-16.43
Individual,PK,-4.09
Individual,PK,-11.06
Individual,PK,-9.04
Individual,PK,-8.56
Individual,PK,-8.13
Individual,PK,-14.46
Individual,PK,-4.21
Individual,PK,-4.96
Individual,PK,-5.48
Multiwriter,PK,-3.31
Multiwriter,PK,-5.62
Multiwriter,PK,-4.48
Multiwriter,PK,-6.08
Multiwriter,PK,-4.68
Multiwriter,PK,-6.92
Multiwriter,PK,-11.29
Multiwriter,PK,6.66
Multiwriter,PK,1.66
Multiwriter,PK,3.39
Multiwriter,PK,0.06
Multiwriter,PK,4.11
Multiwriter,PK,-1.57
Multiwriter,PK,1.33
Multiwriter,PK,-6.91
Multiwriter,PK,4.87
Multiwriter,PK,-10.87
Multiwriter,PK,6.25
Multiwriter,PK,-0.68
Multiwriter,PK,0.11
Multiwriter,PK,0.71
Multiwriter,PK,-3.8
Multiwriter,PK,-1.75
Multiwriter,PK,-5.38
Multiwriter,PK,1.24
Multiwriter,PK,-5.59
Multiwriter,PK,4.98
Multiwriter,PK,0.98
Multiwriter,PK,7.47
Multiwriter,PK,-5.25
Multiwriter,PK,-14.24
Multiwriter,PK,-1.55
Multiwriter,PK,-8.44
Multiwriter,PK,-7.67
Multiwriter,PK,5.85
Multiwriter,PK,6
Multiwriter,PK,-7.53
Multiwriter,PK,1.59
Multiwriter,PK,-9.48
Multiwriter,PK,-3.99
Multiwriter,PK,-5.82
Multiwriter,PK,1.62
Multiwriter,PK,-4.14
Multiwriter,PK,1.06
Multiwriter,PK,4.52
Multiwriter,PK,-5.6
Multiwriter,PK,-3.38
Multiwriter,PK,4.82
Multiwriter,PK,0.76
Multiwriter,PK,-4.95
Multiwriter,PK,-2.05
Column,PK,1.64
Column,PK,5.2
Column,PK,2.8
Column,PK,1.93
Column,PK,2.36
Column,PK,4.77
Column,PK,-1.92
Column,PK,-2.94
Column,PK,4.58
Column,PK,2.98
Column,PK,9.07
Column,PK,8.5
Column,PK,1.23
Column,PK,8.97
Column,PK,4.1
Column,PK,7.25
Column,PK,0.02
Column,PK,-3.48
Column,PK,1.01
Column,PK,2.7
Column,PK,-2.32
Column,PK,3.22
Column,PK,-2.37
Column,PK,-13.28
Column,PK,-4.36
Column,PK,2.91
Column,PK,4.4
Column,PK,-5.07
Column,PK,-10.24
Column,PK,12.8
Column,PK,1.92
Column,PK,13.24
Column,PK,12.32
Column,PK,12.7
Column,PK,9.95
Column,PK,12.11
Column,PK,7.63
Column,PK,11.09
Column,PK,13.04
Column,PK,12.06
Column,PK,9.49
Column,PK,8.64
Column,PK,10.05
Column,PK,6.4
Column,PK,9.64
Column,PK,3.53
Column,PK,4.78
Column,PK,9.54
Column,PK,8.49
Column,PK,2.56
Column,PK,8.82
Column,PK,-3.59
Column,PK,-3.31
Column,PK,10.05
Column,PK,-0.28
Column,PK,-0.5
Column,PK,-6.37
Column,PK,2.97
Column,PK,4.49
Column,PK,9.14
Column,PK,4.5
Column,PK,8.6
Column,PK,6.76
Column,PK,3.67
Column,PK,6.79
Column,PK,5.77
Column,PK,10.5
Column,PK,1.57
Column,PK,9.47
Individual,US,-9.85
Individual,US,-2.73
Individual,US,-0.32
Individual,US,-0.94
Individual,US,-7.51
Individual,US,-8.21
Individual,US,-7.33
Individual,US,-5.1
Individual,US,-1.58
Individual,US,-2.49
Individual,US,-1.36
Individual,US,-5.76
Individual,US,-0.48
Individual,US,-3.38
Individual,US,2.42
Individual,US,-1.71
Individual,US,-2.17
Individual,US,-2.81
Individual,US,-0.64
Individual,US,-8.88
Individual,US,-1.53
Individual,US,-1.42
Individual,US,-17.89
Individual,US,7.1
Individual,US,-4.12
Individual,US,-0.83
Individual,US,2.05
Individual,US,-5.87
Individual,US,-0.15
Individual,US,5.78
Individual,US,-1.96
Individual,US,1.77
Individual,US,-0.67
Individual,US,-10.23
Individual,US,3.37
Individual,US,-1.18
Individual,US,6.94
Individual,US,-3.86
Individual,US,2.21
Individual,US,-11.64
Individual,US,-14.71
Individual,US,-12.74
Individual,US,-6.24
Individual,US,-13.64
Individual,US,-8.53
Individual,US,-10.4
Individual,US,-6.24
Individual,US,-12.15
Individual,US,-15.96
Multiwriter,US,11.27
Multiwriter,US,3.51
Multiwriter,US,4.05
Multiwriter,US,3.81
Multiwriter,US,8.56
Multiwriter,US,6.36
Multiwriter,US,-8.99
Multiwriter,US,3.36
Multiwriter,US,3.18
Multiwriter,US,-5.22
Multiwriter,US,-8.61
Multiwriter,US,-9.02
Multiwriter,US,-6.32
Multiwriter,US,0.53
Multiwriter,US,11.03
Multiwriter,US,-5.7
Multiwriter,US,4
Multiwriter,US,-3.55
Multiwriter,US,2.79
Multiwriter,US,4.61
Multiwriter,US,-3.8
Multiwriter,US,-9.62
Multiwriter,US,-8.37
Multiwriter,US,-2.18
Multiwriter,US,-1.64
Multiwriter,US,-9.99
Multiwriter,US,-1.44
Multiwriter,US,-4.45
Multiwriter,US,-7.84
Multiwriter,US,-11.6
Multiwriter,US,-2.71
Multiwriter,US,1.2
Multiwriter,US,-6.44
Multiwriter,US,-2.64
Multiwriter,US,-11.59
Multiwriter,US,-5.9
Multiwriter,US,-3.78
Multiwriter,US,-14.99
Multiwriter,US,1.32
Multiwriter,US,-6.55
Multiwriter,US,0.92
Multiwriter,US,-5.61
Multiwriter,US,-14.16
Multiwriter,US,-10.03
Multiwriter,US,-7.08
Multiwriter,US,0.62
Multiwriter,US,-5.43
Multiwriter,US,-1.11
Multiwriter,US,-11.37
Multiwriter,US,-13.37
Multiwriter,US,-12.71
Multiwriter,US,1.86
Multiwriter,US,14.11
Multiwriter,US,-5.24
Multiwriter,US,-6.77
Multiwriter,US,-4.79
Multiwriter,US,-6.22
Multiwriter,US,3.66
Multiwriter,US,-2.65
Multiwriter,US,-2.87
Multiwriter,US,-12.32
Multiwriter,US,-7.48
Multiwriter,US,-4.84
Multiwriter,US,0.44
Column,US,8.93
Column,US,10.29
Column,US,8.31
Column,US,5.88
Column,US,8.87
Column,US,-2.9
Column,US,3.71
Column,US,8.43
Column,US,1.47
Column,US,3.05
Column,US,-1.78
Column,US,1.14
Column,US,7.2
Column,US,5.22
Column,US,5.53
Column,US,8.14
Column,US,-2.22
Column,US,0.89
Column,US,2.5
Column,US,6.77
Column,US,3.63
Column,US,2.86
Column,US,3.7
Column,US,7.52
Column,US,3.12
Column,US,0
Column,US,0.28
Column,US,6.86
Column,US,-0.32
Column,US,2.92
Column,US,-1.14
Column,US,-1.11
Column,US,4.42
Column,US,4.37
Column,US,1.09
Column,US,-3.66
Column,US,7.09
Column,US,-11.02
Column,US,-0.78
Column,US,8.44
Column,US,4.88
Column,US,-3.9
Column,US,-0.21
Column,US,6.48
Column,US,4.49
Column,US,-8.89
Column,US,-0.73
Column,US,1.76
Column,US,-4.31
Column,US,4.63
Column,US,8.91
Column,US,3.55
Column,US,6.69
Column,US,-4.45
Column,US,9.82
Column,US,6.79
Column,US,1.84
Column,US,8.97
Column,US,2.38
Column,US,4.68
Column,US,9.23
Column,US,2.85
Column,US,4.19
Column,US,2.43
Column,US,5.48
Column,US,-1.08
Column,US,7.47
Column,US,3.13
Column,US,-0.42
Column,US,-0.71
Column,US,6.51
Column,US,6.34
Column,US,3.94
Column,US,5.46
Column,US,0.39
Column,US,8.15
Column,US,7.99
Column,US,6.26
Column,US,7.91
Column,US,14.18
Column,US,7.41
Column,US,7.16
Column,US,5.6
Column,US,7.51
Column,US,6.24
Column,US,3.67
Column,US,3.84
Column,US,2.37
Column,US,-3.5
Column,US,5.02
Column,US,-6.04
Column,US,5.36
Column,US,1.98
Column,US,7.79
Column,US,0.02
Column,US,-1.9
Column,US,-2.81
Column,US,10.69
Column,US,1.65
Column,US,8.19
Column,US,1.92
How can I access the 'Dim1' values for the subset where Blog is 'Column' and Region is 'US'?
I have tried reading about the 'data frame', 'table', 'factor', and 'matrix' data types in R, but I could not find help on how to access a subset of a table like this. (My real data includes additional numerical columns like Dim1, i.e. Dim2, Dim3, Dim4, Dim5, but those should work the same way in principle, so I have not included them in this example.)
I assume you want to select only the rows where the first column is 'Column' and the second is 'US'.
If so, you can select that subset using:
data[data[,1]=='Column' & data[,2]=='US',]
To get just the Dim1 values for that subset, use the column names instead of indices:
data[data$Blog=='Column' & data$Region=='US', 'Dim1']

Pyparsing - name not starting with a character

I am trying to use pyparsing to identify a keyword that does not begin with $. So for the following input:
$abc = 5 # is not a valid one
abc123 = 10 # is valid one
abc$ = 23 # is a valid one
I tried the following
from pyparsing import Word, printables
var = Word(printables, excludeChars='$')
var.parseString('$abc')
But this doesn't allow any $ in var. How can I specify all printable characters other than $ in the first character position? Any help will be appreciated.
Thanks
Abhijit
You can use the method I used to define "all characters except X" before I added the excludeChars parameter to the Word class:
from pyparsing import Word, printables

NOT_DOLLAR_SIGN = ''.join(c for c in printables if c != '$')
keyword_not_starting_with_dollar = Word(NOT_DOLLAR_SIGN, printables)
This should be a bit more efficient than building up with a Combine and a NotAny. But this will match almost anything, integers, words, valid identifiers, invalid identifiers, so I'm skeptical of the value of this kind of expression in your parser.
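As a quick sanity check outside the grammar, the same rule ("any printable except $ in the first position, any printables afterwards") can be sketched with the standard library alone; string.printable minus whitespace mirrors pyparsing's printables:

```python
import string

# pyparsing's `printables` is the set of printable ASCII characters
# with whitespace removed; rebuild it from the standard library.
PRINTABLES = set(string.printable) - set(string.whitespace)

def is_valid_name(token):
    """True if token is non-empty, made of printables, and doesn't start with '$'."""
    return (bool(token)
            and token[0] != '$'
            and all(ch in PRINTABLES for ch in token))

print(is_valid_name('$abc'))    # False: starts with '$'
print(is_valid_name('abc123'))  # True
print(is_valid_name('abc$'))    # True: '$' is allowed after the first character
```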

Display Arabic text as separate characters (instead of cursive script) using CSS

To display license plates in Arabic, I wish to have each letter displayed without joining adjacent characters.
Is it possible to display Arabic text as separate characters, without the cursive script?
There are dedicated isolated-form Arabic Unicode characters (in the Arabic Presentation Forms blocks) for exactly this kind of purpose.
It's all explained in this Wikipedia page.
A demonstration for the basic alphabet used in Modern Standard Arabic:
General
Unicode   Letter   Name             Isolated   End    Middle   Beginning
0623      أ        ʾalif            FE83       FE84   —        —
0628      ب        bāʾ              FE8F       FE90   FE92     FE91
062A      ت        tāʾ              FE95       FE96   FE98     FE97
062B      ث        ṯāʾ              FE99       FE9A   FE9C     FE9B
062C      ج        ǧīm              FE9D       FE9E   FEA0     FE9F
062D      ح        ḥāʾ              FEA1       FEA2   FEA4     FEA3
062E      خ        ḫāʾ              FEA5       FEA6   FEA8     FEA7
062F      د        dāl              FEA9       FEAA   —        —
0630      ذ        ḏāl              FEAB       FEAC   —        —
0631      ر        rāʾ              FEAD       FEAE   —        —
0632      ز        zayn/zāy         FEAF       FEB0   —        —
0633      س        sīn              FEB1       FEB2   FEB4     FEB3
0634      ش        šīn              FEB5       FEB6   FEB8     FEB7
0635      ص        ṣād              FEB9       FEBA   FEBC     FEBB
0636      ض        ḍād              FEBD       FEBE   FEC0     FEBF
0637      ط        ṭāʾ              FEC1       FEC2   FEC4     FEC3
0638      ظ        ẓāʾ              FEC5       FEC6   FEC8     FEC7
0639      ع        ʿayn             FEC9       FECA   FECC     FECB
063A      غ        ġayn             FECD       FECE   FED0     FECF
0641      ف        fāʾ              FED1       FED2   FED4     FED3
0642      ق        qāf              FED5       FED6   FED8     FED7
0643      ك        kāf              FED9       FEDA   FEDC     FEDB
0644      ل        lām              FEDD       FEDE   FEE0     FEDF
0645      م        mīm              FEE1       FEE2   FEE4     FEE3
0646      ن        nūn              FEE5       FEE6   FEE8     FEE7
0647      ه        hāʾ              FEE9       FEEA   FEEC     FEEB
0648      و        wāw              FEED       FEEE   —        —
064A      ي        yāʾ              FEF1       FEF2   FEF4     FEF3
0622      آ        ʾalif maddah     FE81       FE82   —        —
0629      ة        tāʾ marbūṭah     FE93       FE94   —        —
0649      ى        ʾalif maqṣūrah   FEEF       FEF0   —        —
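As a sketch of how the isolated forms can be applied in code, the snippet below maps a few of the basic letters from the table above to their isolated presentation forms so adjacent letters never join (the map is deliberately partial; extend it from the table for full coverage):

```python
# Partial map from basic Arabic letters to their isolated presentation
# forms (Arabic Presentation Forms-B), taken from the table above.
ISOLATED_FORMS = {
    '\u0628': '\uFE8F',  # bāʾ
    '\u062A': '\uFE95',  # tāʾ
    '\u0644': '\uFEDD',  # lām
    '\u0645': '\uFEE1',  # mīm
    '\u0646': '\uFEE5',  # nūn
    '\u0648': '\uFEED',  # wāw
}

def isolate(text):
    """Replace each mapped letter with its isolated (non-joining) form."""
    return ''.join(ISOLATED_FORMS.get(ch, ch) for ch in text)

print(isolate('\u0644\u0628'))  # lām + bāʾ rendered as two separate glyphs
```

An alternative that avoids the presentation-forms block entirely is to insert U+200C (ZERO WIDTH NON-JOINER) between the letters; it also prevents joining while keeping the underlying text in the standard Arabic block.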
