Are there any current non-Harvard architecture microcontrollers?

I have used and like the Atmel ATmega and ATtiny series microcontrollers, and think them quite good. One thing I am not terribly fond of, though, is the fact that they (and the Microchip PIC µC family as well) are all Harvard machines, meaning I can't really put external memory to use or execute out of RAM, only out of flash.
While there are obvious advantages to this design, it makes it technically very difficult to do things like FORTH using an AVR or PIC. (I know there is at least one implementation, but it does not work like a normal FORTH and will wear out the flash rather rapidly)
FORTH was originally created for interactive machine-control systems where lots of flexibility was needed, so things like the Z80 or 6809 were used as microcontrollers, with the control program executing out of RAM or some other storage device.
Does anyone know of current devices of similar complexity to the AVR/PIC (preferably available in DIP packages) that are von Neumann machines?

In addition to the Freescale processors (which starblue has already pointed out), the Texas Instruments MSP430 family uses the von Neumann architecture. However, only the smallest ones are available in a DIP package.
UPDATE to include PIC32:
In my original post, I had forgotten that PIC32 microcontrollers have always been able to execute out of RAM, as demonstrated by this code example; and now Microchip has come out with the new PIC32MZ line of microcontrollers, with up to 2 MB of flash and 512 KB of RAM, which makes them feasible for fairly large RAM-based programs. Unfortunately none of those chips are available in DIP packages.
However Olimex, sort of the Bulgarian equivalent of SparkFun and Adafruit, has a PIC32-HMZ144 development board for 21.95 EUR, which is about $24. This is a smoking hot deal, since the processor alone costs over $12 at Digi-Key. (There are other boards available from US suppliers from around $50 and up.)
The original PIC32MX line has twenty variants in 28-pin DIP packages, but they are limited to a maximum of 64 KB of RAM, which is still useful for some projects.
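For readers who haven't seen the trick, here is a minimal sketch of what "executing out of RAM" looks like in C. The buffer size and function names are illustrative, and on a real PIC32 you would also need to make sure the copied instructions are visible to the instruction-fetch path (for example by using an uncached address or flushing the cache):

```c
#include <stdint.h>
#include <string.h>

typedef int (*ram_func_t)(int);

static uint32_t ram_code[64];   /* buffer in data RAM (illustrative size) */

/* Copy a blob of machine code into RAM and run it. This only works on
 * cores that can fetch instructions from data RAM (von Neumann parts,
 * or "modified Harvard" parts like the PIC32); a strict Harvard core
 * such as a classic AVR cannot do this at all. */
int run_from_ram(const uint32_t *code, size_t words, int arg)
{
    memcpy(ram_code, code, words * sizeof code[0]);
    ram_func_t f = (ram_func_t)ram_code;
    return f(arg);              /* jump into the RAM copy */
}
```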

Farnell has a nice search function that lets you search for microcontrollers in DIP packages, though you'll have to figure out which families are non-Harvard by looking at the datasheets.
Take a look at the 68K ones and the HCS08.
Update: In the meantime some ARM Cortex-M controllers in DIP packages have become available, the LPC810M021FN8 and the LPC1114FN28 from NXP.

You might want to peruse the designs available at the OpenCores project. That is an open source project devoted to CPU core designs implemented in VHDL, Verilog, and similar FPGA design languages. There are complete and respectable implementations of classic 8-bit CPUs such as the 8080, 6502, and 8051. The 6502 I linked to claims to be cycle-accurate compared to the original chip. Others are functionally complete, but often have more modern buses and signals.
They won't (I think) be available in DIP packages, but you can always find breakout boards.
The designs are all open source, under a wide variety of licenses.

You may also have a look at the Zilog eZ80. Since they're binary-compatible with the old Z80, you should be able to find a FORTH implementation that runs on them, but you'd probably need to run it on top of good old CP/M :)
Also, these are the only ones that I found that have the memory bus accessible from the outside, i.e. allow code execution from external memory.

The ARM-based ones: even the Cortex-M3 claims to be Harvard, but you can load programs into data RAM and execute from that RAM, so it is not really Harvard. Other ARMs are normally not Harvard, and some have external memory interfaces you can use to expand the internal resources.
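As a concrete illustration of that (a sketch, not a complete project): with GCC on a Cortex-M part you can place a function in SRAM and call it like any other. The ".ramfunc" section name below is an assumption; many vendor linker scripts define such a section and have the startup code copy it from flash to RAM alongside .data.

```c
#include <stdint.h>

/* Placed in SRAM via the linker script's ".ramfunc" section (assumed to
 * exist); the startup code copies it from flash to RAM at boot, and
 * from then on it executes out of RAM, not flash. */
__attribute__((section(".ramfunc"), noinline))
void toggle_pin_from_ram(volatile uint32_t *gpio_out, uint32_t mask)
{
    *gpio_out ^= mask;          /* runs from SRAM */
}
```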

This is actually not an answer, but more of a related query. Why would you go to von Neumann in a microcontroller if the previous generation was Harvard? Isn't it all win-win in terms of performance? Other than complexity (which, if the original PICs could handle it, should not be that great), what are the downsides of having a Harvard architecture?

The new Kinetis line of microcontrollers from Freescale puts an ARM Cortex-M4 inside a microcontroller package, and program code can be located anywhere in addressable space (RAM, flash, or even FlexMemory).
The Kinetis Solution Advisor is a powerful selector guide that can help you find the micro you want: memory from 32 KB to 1 MB, all the peripherals you could want, and pricing from under a dollar to around $10.

Related

Why do I have more flash (STM32F103RCT6) than what my datasheet says?

I'm writing firmware for an STM32F103RCT6 microcontroller, which has 256 KB of flash according to the datasheet.
Because of a mistake of mine, I was writing some data at 0x0807F800, which according to the reference manual is the last page of a high-density device. (The reference manual makes no distinction between the different sizes of 'high-density devices' in the memory layout.)
The data that I wrote was being read back with no errors, so I did some tests: I read and wrote 512 KB of random data, compared the files, and they matched!
(screenshot of the matching file hashes)
I did some research but couldn't find any similar experiences.
Is that extra flash reliable? Is this some kind of industrial maneuver?
I would not recommend using this extra FLASH memory for anything that matters.
It is not guaranteed to be present on other chips with the same part number. If used in a product, that would be a major problem. Even if a sample is successful now, the manufacturer could change the design or processes in the future and take it away.
While it might be perfectly fine on your chip, it could also be prone to corruption if there are weak memory cells.
A common practice in the semiconductor industry is to have several parts that share a common die design. After manufacturing, the dies are tested and sorted. A die might have a defect in a peripheral, so is used as a part that doesn't have that peripheral. Alternatively, it might be perfectly good, but used as a lesser part for business reasons (i.e., supply and demand).
Often, the unused features are disabled by cutting traces, burning fuses, or special programming at the factory, but it's possible extra features might be left intact if there are no negative effects and they are unlikely to be observed.
If this is only for one-off use or experimentation, and corruption is an acceptable condition, I don't really see a harm in using it.

How to reprogram an old computer ROM and use it as ROM memory for another task?

I've ripped open an old Pentium desktop. The main board is a Zida 5svx. From the manual (which I downloaded from the internet) I found the location of the ROM chip on the board, and took it out. The manual mentions that the chip is a flash EEPROM.
Now, what I am interested in is this: is there a way to erase the ROM and flash it with, say, a C program to blink an LED (I know this might put you into a fit of laughter, but read on all the same), or control a motor?
I also want to know if I can construct a mega-sized micro-controller with the left-over Pentium, some MBs of RAM, and this ROM.
Any suggestions?
P.S.: I know that such a µC will require an appropriate power supply setup and so on.
The key is in getting and deeply studying the manufacturer's datasheets for each device you remove and wish to reuse. I am supposing, since you asked the question that you did, that you are not a professional electrical engineer. That's OK, but you will need to do hours, days, or weeks of study to truly understand the datasheets well enough to successfully reuse your motherboard chips, because they are written for professional engineers with years of experience and, unfortunately, were not written to be understood by hobbyists.
If you can succeed in acquiring and thoroughly understanding all of the datasheets (and the related user's guides for the more complex chips), then you have made it to the point where you might be able to start a custom design based on your recovered parts, on paper at least. In order to test your design and ensure that each part of it is working, you will need at least an oscilloscope and a volt meter, and the knowledge of how to use them. An understanding of basic electronics is essential; you will not succeed without it. Very good soldering/rework/assembly skills will be required as well if you hope to have your design truly work: you can do everything else right and it can still fail if your skills in this area are lacking.
There is simply not enough time for me to advise you on everything you will need to know. But if you are motivated, dedicated, and you don't give up when setbacks and roadblocks occur (and trust me, they occur all too frequently for even the best engineers and best designs), meaning that you are not easily frustrated when things don't work, then you have a chance at success. I wish you all the best, and try to have fun while doing it (important in case fun is all you ever get out of your project). :)

Developing with OpenCL on ATI and Nvidia at the same time

Our workgroup is slowly trying out a little bit of OpenCL in a side project. So far 'everybody' is working on an NVIDIA Quadro FX 580. Now we are planning to buy new computers for new colleagues, and instead of the FX 580 we could buy an ATI FirePro V4800, which costs only 15 EUR more and gives us 1 GB instead of 512 MB of RAM, which will be beneficial for our data-intensive tasks.
So, how much trouble is it to develop OpenCL code at the same time on Nvidia and ATI?
I read the following SO question, Running OpenCL on hardware from mixed vendors, which was very pessimistic about developing on/for different vendors. On the other side, the question is already a year old.
What do you recommend?
I have previously worked extensively with the CUDA programming language.
I have been planning to start developing apps using OpenCL. As you mentioned, one of the best features of OpenCL is that it runs on hardware from many vendors (Intel, AMD, and Nvidia).
One project that I came across that uses OpenCL extensively for large-scale development is http://sourceforge.net/projects/hypgad/. It might be a good idea to look at the source code from this group and understand how they have developed their application on so many kinds of hardware, including the Sony Cell processor.
Another approach would be to use PyOpenCL, which provides a higher level of abstraction than OpenCL and can significantly reduce the coding effort.
Do you need the code to run unchanged on both bits of hardware? If so you may have to develop for a limited subset of common functions.
If you can run slightly different code on each, you will probably get better performance. In CUDA/OpenCL you generally have to tune the algorithms for the amount of RAM and the number of GPU engines anyway, so it shouldn't be much more work to also tweak for Nvidia/AMD.
The biggest problem is workgroup sizes. Some ATI cards I have used crash at sizes above 64, but that may be the Apple OS X 10.6 drivers I am using.
Developing for both ATI and NVIDIA is actually not too difficult, so long as you avoid using any part of either vendor's SDK. Stick to OpenCL as it is defined in the OpenCL spec (www.khronos.org/opencl) and your code will stay syntactically portable. Due to differences in the underlying architectures, performance portability may be an issue. Local and global work sizes really have to be determined independently for each card to maximize performance. Another thing to pay attention to is the types being used. Vector types (float2, float4) are especially useful on ATI cards, as each processing element actually contains 4 execution units (one for each RGB color channel, plus alpha).
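To make the work-size point concrete, here is a small sketch (mine, not from the answers above) of choosing the local work size per device at run time instead of hard-coding it; clGetKernelWorkGroupInfo is standard OpenCL, the surrounding helper is illustrative:

```c
#include <CL/cl.h>

/* Choose a local work size that the current device/kernel pair can
 * actually handle and that divides the global size evenly. */
size_t pick_local_size(cl_kernel kernel, cl_device_id device, size_t global_size)
{
    size_t max_wg = 1;
    clGetKernelWorkGroupInfo(kernel, device, CL_KERNEL_WORK_GROUP_SIZE,
                             sizeof max_wg, &max_wg, NULL);

    size_t local = 64;                          /* common starting point */
    while (local > 1 && (local > max_wg || global_size % local != 0))
        local /= 2;                             /* halve until it fits   */
    return local;
}
```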

Fast interpreted language for memory constrained microcontroller

I'm looking for a fast interpreted language for a microcontroller.
The requirements are:
should be fast (not crucial but would be nice)
should be light on data memory (small overhead <8KB, excludes program variable space)
preferably would be small in program size and the language would be compact
preferably, human readable (for example, BASIC)
Thanks!
Some AVR interpreters:
http://www.cqham.ru/tbcgroup/index_eng.htm
http://www.jcwolfram.de/projekte/avr/chipbasic2/main.php
http://www.jcwolfram.de/projekte/avr/chipbasic8/main.php
http://www.jcwolfram.de/projekte/avr/main.php
http://code.google.com/p/python-on-a-chip/
http://www.avrfreaks.net/index.php?module=Freaks%20Academy&func=viewItem&item_id=688&item_type=project
http://www.avrfreaks.net/index.php?module=Freaks%20Academy&func=viewItem&item_id=626&item_type=project
http://www.avrfreaks.net/index.php?module=Freaks%20Academy&func=viewItem&item_id=460&item_type=project
This is a bit generic: there are many kinds of microcontrollers, and thanks to technologies like Jazelle, it is possible to run hardware-accelerated Java on microcontrollers (if... your microcontroller supports it).
For a generic answer: Forth is commonly referenced. But really, you need to be far more specific with your question.
Microcontrollers come in a vast variety of architectures. There are small 8-bit families, 32-bit families with simple architectures, and 32-bit families with MMU support suitable for running a modern OS. If you don't state which family you are targeting, it is impossible to answer your question.
Anyway, for the 8-bit families the best you can get is a BASIC variant. See Bascom for example. Note that this would be a compiled version of the "interpreted" language. If you actually want a runtime or an interpreter that will execute your code, then you most probably need to install an operating system on your microcontroller.
There were a variety of interpreted languages for small micros back in the late 1970s and 1980s. They seem to have mostly fallen out of fashion. I'd like to have a p-code based C compiler for the PIC18 that could coexist nicely with my other C compiler; for much of my code I'd be willing to accept a 100-fold slowdown for a 50% space reduction (so long as I could keep the important stuff in native code). I would think that would be achievable, but I'm not about to implement such a thing from scratch myself.
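To show what that trade-off looks like, here is a toy sketch of the kind of p-code dispatch loop such an interpreter uses. The opcodes and encoding are invented for illustration; the point is that each byte-coded instruction stands in for several native instructions, which is where the space saving (and the slowdown) comes from.

```c
#include <stdint.h>

/* Toy p-code virtual machine with a tiny operand stack. */
enum { OP_PUSH, OP_ADD, OP_MUL, OP_HALT };

int16_t run_pcode(const uint8_t *code)
{
    int16_t stack[16];
    int sp = 0;

    for (;;) {
        switch (*code++) {
        case OP_PUSH: stack[sp++] = (int8_t)*code++;     break;
        case OP_ADD:  sp--; stack[sp - 1] += stack[sp];  break;
        case OP_MUL:  sp--; stack[sp - 1] *= stack[sp];  break;
        case OP_HALT: return stack[sp - 1];
        }
    }
}

/* Example program: computes (2 + 3) * 4 == 20 in nine bytes of p-code. */
static const uint8_t demo[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD,
                                OP_PUSH, 4, OP_MUL, OP_HALT };
```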

Hardware Programming - Hands-On Learning

Besides Arduino, what other ways are there to learn hardware programming in a hands-on way? Are there any nifty kits available, either a pre-assembled robot, that you can program to move a certain way, or do certain things, or anything similar to that?
Atmel AVR and the PIC both have experiment boards that you can solder stuff onto; usually they have a couple of buttons and some lights pre-soldered to the board. These let you program/flash the microprocessor and play with the output pins. You can write the programs in either assembly or C.
Parallax has a number of kits. They have two product lines suited for "playing around": the Basic Stamp and something called the Propeller. The former is a small microprocessor that runs programs written in BASIC (a tad disgusting ;)) and the latter runs something called Spin, or assembly (well, after compilation, obviously).
I would go with either AVR or the PIC. I've done PIC but I've heard good things about AVR, they seem to ship with better software.
At first look, Microsoft's VPL sounds good, but when it comes to actually LEARNING how hardware works, it goes a LONG way to hide those details from you. As a matter of fact it is pretty much designed for people who don't program, and is distasteful to someone who has actually written embedded software. If you just want to make stuff happen and not delve into the details, it's fine; but if you want to get down to the metal, like programming the Arduino boards, it's not for you.
If you're used to something like the Arduino, then something like the PIC will be an easy transition. SparkFun Electronics has all sorts of DIY-type projects and hardware available. If you have a decent bookstore around your area, I would suggest looking for "Circuit Cellar" magazine. It has monthly articles with projects for someone looking to get into hardware, everything from homebrew software-defined radio to FPGA-based 3D graphics (ray tracing, actually). Usually the authors describe the project and "WHY" they made the decisions they did, give a description and schematics of the hardware, and provide a link to the source code.
Cypress Semiconductor has one of the most interesting embedded processors on the market and several high-quality dev boards for sale. The PSoC includes the ability not only to configure the software, but also to "drop in" software-configured hardware such as analog-to-digital converters, serial I/O, digital-to-analog converters, and various amps and filters. It's a REALLY cool concept, and the touch-sensor capability of the PSoC was actually used in several models of the iPod.
One thing about programming these little micros is that they don't have a lot between you and the hardware; you get to see how things really work. It doesn't matter whether you're talking about an 8-bit microcontroller or a quad-core Pentium, programming hardware is largely the same concept: you write to a memory-mapped register for some piece of hardware like a serial controller, and the hardware responds in some way.
If you program a baud rate generator in a PIC or a PC, it's largely the same idea: you write a value that will be used as a division factor from a given clock to achieve a given baud rate. The numbers and names may be different, but the concept is the same. On a PC you may have to map to the PCI address of the card, which adds some complications, but if you looked underneath the OS you would see that this is done just by writing values to registers, similar to programming a PIC to use a different "page" of memory.
Is it worth learning an 8-bitter? Well, there are approximately $5 billion in sales of the little 8-bit micros today, with projections only showing growth in that market in the future. I saw one reference stating that the average car has 25 microcontrollers in it. That's not too bad.
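As an illustration of the baud-rate example just mentioned, a small sketch in C; the register name, address, and divisor formula are hypothetical stand-ins for whatever the real part's datasheet specifies:

```c
#include <stdint.h>

/* Hypothetical memory-mapped baud-rate divisor register of a UART. */
#define UART0_BRG   (*(volatile uint16_t *)0x40001008u)

/* Program the divisor for a desired baud rate, rounding to the nearest
 * integer: divisor = Fpclk / (16 * baud). */
void uart_set_baud(uint32_t peripheral_clock_hz, uint32_t baud)
{
    UART0_BRG = (uint16_t)((peripheral_clock_hz + 8u * baud) / (16u * baud));
}
```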
I haven't played with it much, but the iRobot looks pretty cool.
The ability to simulate how your robot will work which some of the other answers mentioned is nice, but there's nothing like seeing a real-life robot do what you programmed it to do. That, to me, is what really makes robots fun and cool.
There's the .NET Micro Framework.
It's incredibly simple to use/setup and there's lots of hardware being made to target this framework.
You should take a look at Microsoft Robotics Developer Studio which supports many different kits.
I have always been curious about Gumstix. It seems more professional than Arduino, and it is aimed at the Linux programmer. I cannot give you a real suggestion, as I've never played with it, but I would definitely go with one of these toys if I had to do and learn some cool hardware programming.
