Hardware Programming - Hands-On Learning - Arduino

Besides Arduino, what other ways are there to learn hardware programming in a hands-on way? Are there any nifty kits available, such as a pre-assembled robot that you can program to move a certain way or do certain things, or anything similar to that?

Atmel AVR and the PIC both have experiment boards that you can solder stuff onto; they usually have a couple of buttons and some lights pre-soldered to the board. These let you program/flash the microcontroller and play with the output pins. You can write the programs in either assembly or C.
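For a rough idea of what playing with those pins looks like in C, here is a minimal AVR sketch; the pin assignments (LED on PB0, button on PD2) and the 1 MHz clock are assumptions for illustration, so check your board's schematic:

    /* Minimal AVR example: toggle an LED each time a button is pressed.
       Pin choices and clock speed are assumptions, not taken from any
       specific experiment board. */
    #define F_CPU 1000000UL            /* factory-default internal RC clock */
    #include <avr/io.h>
    #include <util/delay.h>

    int main(void)
    {
        DDRB |= (1 << PB0);            /* PB0 as output, drives the LED    */
        DDRD &= ~(1 << PD2);           /* PD2 as input, reads the button   */
        PORTD |= (1 << PD2);           /* enable the internal pull-up      */

        for (;;) {
            if (!(PIND & (1 << PD2))) {    /* button pressed (active low)  */
                PORTB ^= (1 << PB0);       /* toggle the LED               */
                _delay_ms(200);            /* crude debounce               */
            }
        }
    }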
Parallax has a number of kits. They have two product lines suited for playing around: the Basic Stamp and the Propeller. The former is a small microcontroller that runs programs written in BASIC (a tad disgusting ;)) and the latter runs a language called Spin, or assembly (well, after compilation, obviously).
I would go with either the AVR or the PIC. I've used the PIC, but I've heard good things about the AVR; it seems to ship with better software.

At first look Microsoft's VPL sounds good, but when it comes to actually LEARNING how hardware works, it goes a LONG way to hide those details from you. As a matter of fact, it is pretty much designed for people who don't program, and it is distasteful to someone who has actually written embedded software. If you just want to make stuff happen and not delve into the details, it's fine, but if you want to get down to the metal, like programming the Arduino boards, it's not for you.
If you're used to something like the Arduino, then something like the PIC will be an easy transition. SparkFun Electronics has all sorts of DIY-type projects and hardware available. If you have a decent bookstore in your area, I would suggest looking for "Circuit Cellar" magazine. Every month it has articles and projects for someone looking to get into hardware, everything from homebrew software-defined radio to FPGA-based 3D graphics (ray tracing, actually). Usually the authors describe the project and WHY they made the decisions they did, give a description and schematics of the hardware, and provide a link to the source code.
Cypress Semiconductor has one of the most interesting embedded processors on the market and several high-quality dev boards for sale. The PSoC lets you configure not only the software but also "drop in" software-configured hardware such as analog-to-digital converters, serial I/O, digital-to-analog converters, and various amps and filters. It's a REALLY cool concept, and the touch-sensor capability of the PSoC was actually used in several models of the iPod.
One thing about programming these little micros is that they don't have a lot between you and the hardware; you get to see how things really work. It doesn't matter whether you're talking about an 8-bit microcontroller or a quad-core Pentium, programming hardware is largely the same concept. You write to a memory-mapped register for some piece of hardware like a serial controller, and the hardware responds in some way. If you program a baud-rate generator in a PIC or a PC it's largely the same idea: you write a value that will be used as a division factor from a given clock to achieve a given baud rate. The numbers and names may be different, but the concept is the same. On a PC you may have to map to the PCI address of the card, which adds some complications, but if you looked underneath the OS you would see that it was done just by writing values to registers, similar to programming a PIC to use a different "page" of memory.

Is it worth learning an 8-bitter? Well, there are approximately $5 billion in sales of the little 8-bit micros today, with projections showing only growth in that market in the future. I saw one reference that stated the average car has 25 microcontrollers in it. That's not too bad.
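To make that baud-rate example concrete, here is roughly what the register write looks like in C. The register name, its address, and the peripheral clock below are made up for illustration; the divide-by-16-times-the-baud-rate formula is the common UART pattern, but the exact formula for your part is in its datasheet:

    /* Sketch of programming a baud-rate generator through a memory-mapped
       register. UART_BAUD_DIV and its address are hypothetical; a real part
       documents the register name, address, and exact formula. */
    #include <stdint.h>

    #define PERIPH_CLOCK_HZ  8000000UL                           /* assumed clock   */
    #define UART_BAUD_DIV    (*(volatile uint16_t *)0x40001008u) /* made-up address */

    static void uart_set_baud(uint32_t baud)
    {
        /* Typical formula: clock / (16x oversampling * baud rate), minus one. */
        UART_BAUD_DIV = (uint16_t)(PERIPH_CLOCK_HZ / (16u * baud) - 1u);
    }

Whether the register lives at a fixed address in a PIC or behind a PCI mapping on a PC, the call ends up as the same kind of store: uart_set_baud(9600) just writes a divisor.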

I haven't played with it much, but the iRobot looks pretty cool.
The ability to simulate how your robot will work, which some of the other answers mentioned, is nice, but there's nothing like seeing a real-life robot do what you programmed it to do. That, to me, is what really makes robots fun and cool.

There's the .NET Micro Framework.
It's incredibly simple to use/setup and there's lots of hardware being made to target this framework.

You should take a look at Microsoft Robotics Developer Studio which supports many different kits.

I have always been curious about Gumstix. It seems more professional than Arduino, and it is aimed at the Linux programmer. I can't give you a real recommendation, as I've never played with one, but I would definitely go with one of these toys if I had to do and learn some cool hardware programming.

Related

How to reprogram an old computer ROM and use it as ROM memory for another task?

I've ripped open an old Pentium desktop. The main board is a Zida 5svx. From the manual (which I downloaded from the internet) I found the location of the ROM chip on the board, and took it out. The manual says the chip is a Flash EEPROM.
Now, what I am interested in is this: is there a way to erase the ROM and flash it with, say, a C program to blink an LED (I know this might put you into a fit of laughter, but read on all the same) or to control a motor?
I also want to know whether I can construct a mega-sized microcontroller with the leftover Pentium, some MB of RAM, and this ROM.
Any suggestions?
P.S.: I know that such a uC will require an appropriate power supply setup and other things.
The key is in getting and deeply studying the manufacturer's datasheets for each device you remove and wish to reuse. Since you asked this question, I am assuming you are not a professional electrical engineer. That's OK, but you will need to spend hours, days, or weeks of study to truly understand the datasheets well enough to successfully reuse your motherboard chips, because they are written for professional engineers with years of experience and, unfortunately, not to be understood by hobbyists.

If you can succeed in acquiring and thoroughly understanding all of the datasheets (and the related user's guides as well for the more complex chips), then you have made it to the point where you might be able to start a custom design based on your recovered parts, on paper at least. Testing your design and ensuring that each part of it works will require at least an oscilloscope and a voltmeter, and the knowledge of how to use them. An understanding of basic electronics is essential; you will not succeed without it. Very good soldering/rework/assembly skills will be required as well if you hope to have your design truly work; you can do everything else right and it can still fail if your skills in this area are lacking.

There is simply not enough time for me to advise you on everything you will need to know. But if you are motivated, dedicated, and you don't give up when setbacks and roadblocks occur (and trust me, they occur all too frequently for even the best engineers and best designs), meaning that you are not easily frustrated when things don't work, then you have a chance at success. I wish you all the best, and try to have fun while doing it (important in case fun is all you ever get out of your project). :)

Reading and understanding MCU datasheets and code

Are there any tips for reading source code samples from MCU manufacturers?
I am a newbie to MCU programming. I currently have an MCU, its datasheet, and sample code for it, but the problem is that the sample code seems to be written for experienced users. I have too many questions about why they initialized the RS232 port the way they did, why they set the 4th bit of port 1, and so on.
Do you have any tips for reading, or links to info on how to read, MCU datasheets and sample code?
I guess experience is the only answer I can give. Just like with programming in general, with time you acquire experience as well as learn buzzwords and concepts. With microcontrollers you learn to read datasheets, schematics, etc. You learn about open drain, open collector, weak pull-ups, and so on. And serial ports, for some reason, are always overcomplicated. The hardest part with microcontrollers and the serial port is usually figuring out what to program to get the right clock divisors; some microcontroller serial ports are straightforward, others are overly complicated, some docs are good, some docs are bad, etc.
Another answer is that datasheets are always wrong. There are always gaps in the information that you have to hack around to figure out. Do not write thousands of lines of code in a vacuum using only a datasheet; write a small amount of code, a few lines to a few dozen, test it, and move on. You can get more lines written and debugged in a day that way than by taking the other path. The datasheets are often not written by the engineers who actually designed the hardware; sometimes it is a junior engineer or a non-engineer. Sometimes the information is simply wrong, and sometimes the document is for a different but similar part than the one you have. If they provide software that actually does stuff, it is sometimes (not always) more accurate than the datasheet (when I say datasheet, assume the user's manual, programmer's reference manual, or whatever the vendor calls the doc with the registers, addresses, and bit definitions for the hardware).
With time and experience you may find, if you take a wide enough view, that some vendors tend to do a better job of providing information to users and others do not; some bury the secrets in libraries, sometimes in binary form and not source. Sometimes the secrets are buried in compilers and other tools they provide (which is back to APIs and libraries). I tend to blacklist such companies, but you can't always. ARM, for example, does a very good job of providing the information. The problem is they have so many cores, each with a number of options, that are very similar in nature (supporting the same instruction sets), that it can be difficult to sort out from the docs what the one processor you are using at that moment does and does not do. Atmel is harder to put a finger on: the docs are generally well above par, but more than that, something about Atmel makes them popular with customers. You will never see an Arduino-like following, culture, pick a word, with a Microchip PIC, for example. There are a lot of PIC followers, but it is not like the Atmel world (which was there well before the Arduino thing happened).
Another note: with a single example program and a single datasheet you might not understand the history of a product. There might be code that has been used across a number of chip generations, and there might, for example, be a bit that is required by an older or newer chip, and to share the same code that bit is manipulated. That bit might make sense looking at one datasheet and no sense looking at another. This is where hacking comes in: try it without the bit and see what happens, or study other parts in the family that the code is said to support; it might make more sense.
Google (or your favorite search engine) is your friend; find as much open source code and other material for the particular device as you can. At this level hacking is required. I don't use that term in the bad sense; I mean hacking in the sense that you have to try some of the bits documented in the datasheet, see if they actually work, and if not, see what they do if possible, and look at other source code to see if you can figure it out from that. Just like there is no perfect car that gets infinite miles per gallon, is completely safe, lasts forever, and is inexpensive, there is no perfect chip with the perfect datasheet and sample code. If you want to work at this software/hardware level you have to get your hands dirty, and you can't be afraid to let some smoke out of the chips (there is a finite amount of smoke in a chip; if you let even a little bit out it won't work), etc.
If the reason you won't ask specifically about the MCU or register you are working with is that it is a closed-source product or behind an NDA, then you probably have access to the company that makes the product and should be able to get support from them. Usually better support than you would get from a company you don't have to sign an NDA for. Not that open-document, open-source companies are bad; just that if the company you buy from is interested in you to the point of showing internally protected information, they are interested enough to give you better access to the real engineers who made and know the product. If this is not the case and you are able to talk about it, don't be afraid to just post a question to SO about the register and bits you are wondering about.
Sample code and flow charts in the MCU datasheet are a good starting point for initializing a specific peripheral (like the RS232 port).
You just start from there and track each bit and what it does in the MCU datasheet.
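One way to do that tracking is to rewrite the vendor's init code with one comment per bit, each pointing back at the datasheet section it came from. The register and bit names below are invented purely to show the style, not taken from any particular MCU:

    /* Annotated peripheral init in the style of vendor RS232 examples.
       All register and bit names (UART0_CTRL, CTRL_EN, ...) are hypothetical;
       substitute the ones from your MCU's datasheet and keep the section
       reference next to each bit so the "why" is never lost. */
    #include <stdint.h>

    #define UART0_CTRL   (*(volatile uint32_t *)0x40002000u)  /* made-up address */

    #define CTRL_EN      (1u << 0)   /* enable the UART        (DS sec. 19.4.1) */
    #define CTRL_8BIT    (1u << 2)   /* 8 data bits, no parity (DS sec. 19.4.3) */
    #define CTRL_RXIE    (1u << 5)   /* receive interrupt on   (DS sec. 19.6)   */

    static void uart0_init(void)
    {
        UART0_CTRL = CTRL_8BIT | CTRL_RXIE;  /* configure the format first...      */
        UART0_CTRL |= CTRL_EN;               /* ...then enable, per the flow chart */
    }

Once every bit in the sample code has a comment like that, the parts you still cannot explain are exactly the questions worth asking about.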

Robotics Club Programming Portion

My school has entered a robotics tournament that pits several schools against each other (this is my school's first year). The objective of the robot is to shoot a ball into a hoop. I am a member of the programming team. Our job as the programmers is to program a robot and a computer to control the robot. The computer has two joysticks attached to it, one for moving the entire robot (spinning the wheels and causing the robot to move) and one for the "throwing arm". A signal is sent from the computer to the robot over Wi-Fi. All of the programming MUST be done in LabVIEW.
I had never heard of LabVIEW before I joined this club, and I have my doubts about it. The reason we must use LabVIEW is that most of the kids on the programming team have no programming experience whatsoever. LabVIEW has to be able to interface with the joysticks and then send that information to the robot over Wi-Fi. The microcontroller on the robot supports LabVIEW.
Now to my question: is LabVIEW dynamic enough to perform this task? Can LabVIEW even support networking? Can LabVIEW even interface with the joysticks? I have read a lot of the LabVIEW documentation from this website:
http://www.ni.com/gettingstarted/labviewbasics/environment.htm
My concern is that LabVIEW is not dynamic enough for what we are trying to use it for as a team, and that we are going to have to program the computer and the microcontroller in C. There are only two people on the team who can program sufficiently well in C, so we would have to teach the rest of the members the basics of C.
All relevant answers are welcomed and appreciated.
LabVIEW can totally do this. I am biased: I've written a textbook on it and teach classes :-); I also do this for a living. In comparison to C, well, C can do anything, but LabVIEW handles hardware at a much higher level. That doesn't mean I don't enjoy bending pointers now and then, but it's nice not to have to care about low-level functions for a while.
Interfacing a joystick is pretty simple, it looks like this: http://digital.ni.com/public.nsf/allkb/CA411647F224787B86256DD000669EFE
To interface Wifi, it depends on how the robot should receive the information. TCP/IP would go like this: http://zone.ni.com/devzone/cda/tut/p/id/2710
I'm not sure what you mean by "dynamic enough", but it's certainly possible to create such a system in LabVIEW, and if the users have no experience, they're probably more likely to succeed if they use LV and they're probably going to enjoy it more. There are certainly many groups who use LabVIEW.
There are people who volunteer as mentors for FRC groups, so I would suggest you ask FIRST or your local NI office if they know of anyone (whether C or LV) who can help your group. If you ask NI, they might also be able to help you in other ways.
There's also a similar discussion here - Textual versus Graphical Programming Languages
The web page you provided is very introductory, "Hello world!"-like. Just by learning that, you cannot get an idea of LabVIEW's potential. Sure, you can do everything with C, but with LabVIEW you will get the same task done faster, and I don't think you will need more than two team members working on the program.
LabVIEW is dynamic, especially for the purposes of a robotics club. A white paper outlines some of the possibilities: http://www.ni.com/white-paper/14133/en/
A great resource for people participating in robotics club is the Raspberry Pi website and blog. It is an excellent site to discover what others are doing and creating.
Consider studying the potential of robotics arms and reading white papers from companies that develop them for purposes such as laboratory automation. This is information that could help you if you decide to do this as a career. http://www.hudsonrobotics.com/products/microplate-handling/

Are there any current non-Harvard architecture microcontrollers?

I have used and like the Atmel ATMEGA and ATTINY series microcontrollers, and think them quite good. One thing I am not terribly fond of though is the fact that they (and Microchip PIC uC family also) are all Harvard machines, meaning I can't really put external memory to use or execute out of RAM, only the flash.
While there are obvious advantages to this design, it makes it technically very difficult to do things like FORTH using an AVR or PIC. (I know there is at least one implementation, but it does not work like a normal FORTH and will wear out the flash rather rapidly)
FORTH was originally created for interactive machine-control-type systems where lots of flexibility was needed, so things like the Z80 or 6809 were used as microcontrollers, with the control program executing out of RAM or some other storage device.
Does anyone know of current devices of similar complexity to the AVR/PIC (preferably available in DIP packages) that are von Neumann machines?
In addition to Freescale processors (that starblue has already pointed out), the Texas Instrument MSP430 family uses von Neumann architecture. However only the smallest ones are available in a DIP package.
UPDATE to include PIC32:
In my original post, I had forgotten that PIC32 microcontrollers have always been able to execute out of RAM, as demonstrated by this code example; and now Microchip has come out with the new PIC32MZ line of microcontrollers, with up to 2 MB of flash and 512 KB of RAM, which makes them feasible for fairly large RAM-based programs. Unfortunately none of these chips are available in DIP packages.
However Olimex, sort of the Bulgarian equivalent of SparkFun and Adafruit, has a PIC32-HMZ144 development board for 21.95 EUR, which is about $24. This is a smoking hot deal, since the processor alone costs over $12 at Digi-Key. (There are other boards available from US suppliers from around $50 and up.)
The original PIC32MX line has twenty variants in 28-pin DIP packages, but they are limited to a maximum of 64 KB of RAM, which is still useful for some projects.
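As a generic sketch of what that execute-from-RAM ability is typically used for (this is not the linked Microchip example; the section name, register, and bits are assumptions), a routine placed in RAM can keep running while the flash it would otherwise occupy is busy, for instance during a flash erase:

    /* Sketch of a function intended to execute from RAM on a von Neumann MCU.
       Assumes a GCC-style toolchain whose linker script places the ".ramfunc"
       section in RAM and whose startup code copies it there from flash; the
       flash-controller register and its bits are made up for illustration. */
    #include <stdint.h>

    #define RUN_FROM_RAM __attribute__((section(".ramfunc"), noinline))

    RUN_FROM_RAM void erase_flash_row(volatile uint32_t *flash_ctrl)
    {
        *flash_ctrl = 0x1u;                 /* hypothetical "start erase" command */
        while (*flash_ctrl & 0x1u) { }      /* spin until the erase completes     */
    }

Getting the .ramfunc section located in RAM and copied there at startup is a linker-script and startup-code detail that differs between toolchains.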
Farnell has a nice search function that lets you search for microcontrollers in DIP packages, though you'll have to figure out which families are non-Harvard by looking at the datasheets.
Take a look at the 68K ones and the HCS08.
Update: In the meantime some ARM Cortex-M controllers in DIP packages have become available, the LPC810M021FN8 and the LPC1114FN28 from NXP.
You might want to peruse the designs available at the OpenCores project. That is an open source project devoted to CPU core designs implemented in VHDL, Verilog, and similar FPGA design languages. There are complete and respectable implementations of classic 8-bit CPUs such as the 8080, 6502, and 8051. The 6502 I linked to claims to be cycle-accurate compared to the original chip. Others are functionally complete, but often have more modern buses and signals.
They won't (I think) be available in DIP packages, but you can always find breakout boards.
The designs are all open source, under a wide variety of licenses.
You may also have a look at the Zilog eZ80. Since it's binary-compatible with the old Z80, you should be able to find a FORTH implementation that runs on it, but you'd probably need to run it on top of good old CP/M :)
Also, these are the only ones that I found that have the memory bus accessible from the outside, i.e. allow code execution from external memory.
The ARM-based ones: even the Cortex-M3 claims to be Harvard, but you can load programs into data RAM and execute from that RAM, so it is really not Harvard. Other ARMs are normally not Harvard; some have external memory interfaces you can use to expand the internal resources.
This is actually not an answer, but more of a related query: why would you go to von Neumann in a microcontroller if the previous generation was Harvard? Isn't it all win-win in terms of performance? Other than complexity (which, if the original PICs could handle it, should not be that great), what are the downsides of the Harvard architecture?
The new Kinetis line of microcontrollers from Freescale puts an ARM Cortex-M4 inside a microcontroller package, and program code can be located anywhere in addressable space (RAM or FLASH, or even Flex Memory.)
The Kinetis Solution Advisor is a powerful selector guide that can help you find the micro you want: memory from 32 KB to 1 MB, all the peripherals you could want, and pricing from under a dollar to around $10.

Programming a microcontroller to store images and display them as a slideshow over a DVI/HDMI output at several resolutions?

I would like to solder together a microcontroller, control buttons, and a DVI/HDMI output, and program it so that I can store images on it and display them as a slideshow via the output.
It doesn't have to have a lot of storage capacity; 128 MB would be enough.
But I don't know where to start, because I haven't done anything like this before.
My aim is to present some important images to friends by just taking this hardware, connecting it to a TV screen, and showing the photos. It should be possible to switch photos manually (using a button) or automatically in a slideshow.
It should support several TV resolutions, and it should be connectable to my PC (USB preferred) so that I can upload and delete photos.
So where do I start, and how do I go about it?
Thank you in advance, Andreas
If your aim is just to show some photos, there are assuredly simpler and more cost effective ways to do so; devices exist which do more or less exactly what you are proposing.
If your aim is to learn about microcontrollers and this is a project you are taking up to further that, I would recommend looking into the Arduino (http://www.arduino.cc/) or a similar kit-based micro, and growing your project from that.
Microcontroller + low level language will be a huge pain to work with, particularly if you wish to handle various file formats and screen resolutions. Get a full-blown computer with an OS instead - something like http://en.wikipedia.org/wiki/PC/104
If your goal is purely to be able to display photos then I would recommend using a digital camera with video out capabilities.
If your aim is to learn about electronics and microcontrollers I would start with a good book and an Arduino board. Note that writing microcontroller code to handle file systems, image formats and video output is non-trivial. Simpler projects may be a better starting point as they are more accessible resulting in quicker progress, less frustration and more motivation!
The engineering field is an interesting one. You can start with the website www.microchip.com. You will need a high-end device; consider the PIC32MX795L512, for which there is a nice starter kit, the "Ethernet Starter Kit": http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&nodeId=2615&dDocName=en545713. This kit has an on-board debugger and programmer to do all the hard work.
You get sample projects with the package, and you can program in ANSI C.
IDE: MPLAB, which is free; the C32 compiler has a student/lite version.
Arduino also has a board with the same device.
I personally like www.techtoys.com.hk; they have devices compatible with Microchip boards, like http://www.techtoys.com.hk/PIC_boards/PIC32STK%20SSD1963%20EVK/PIC32STK%20SSD1963%20EVK%20R1A.htm, or this one, http://www.techtoys.com.hk/PIC_boards/PIC2432EVK-RD4/PIC2432%20EVK%20RD4.htm, for which you will need a debugger/programmer like the low-cost PICkit 3 (http://www.microchip.com/pickit3).
The trouble is you need to write the HDMI video library yourself; there are some VGA libraries available, but they are black and white only, and it is very hard to get color out of those analog signals. The rest of the libraries are already there: USB MSD (flash drive), SD card, pictures (JPEG), etc.
http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&nodeId=2680&dDocName=en547784
Feel free to contact me if you need some help, I might be able to help with the HDMI library.
It's a lot of fun to play with these toys.
Regards
Lucas
B-Eng Digital Engineering.
imlucanio#yahoo.com (no spamming)
It sounds like you want an iPod. That is a dead simple thing to work with and it does everything you want. Otherwise, very complicated. I'd suggest the BeagleBoard and embedded Linux. Yes, it warrants that level of complexity.
The options with small microcontrollers just aren't there. The Arduino is very popular, and yes, you can interface an SD card to it; that would be your storage. Yes, you can put a digital potentiometer on it; that can be your interface. I've seen some video overlays that do simple text, but never any JPEG display (too much processing required). And certainly no 24-bit color (so that the output would actually look good); that would take WAY too many pins to do correctly (and the Arduino doesn't have a D/A converter, so you'd have to rig something up, and it would suck). And even then, none of the options for TV out were HDMI, but RCA (the old red/white/yellow cables).
So in short, no. Get a computer. That's what can do the job.
