reading touchscreen

Farzad
Hi,
I want to know how the mini6410 motherboard reads the touchscreen over
1-wire. In other words, I'm after details of the 1-wire connection between
the mini6410 and the STC12LE4052 microcontroller that reads the touch.
best regards,
thanks

anatoly
Interesting question...
It seems the producer has closed it (the protocol is undocumented).
Moreover, the STC12LE4052 MCU is very fast and has embedded crypto, so one
can only communicate with this beast in ARM assembler -- not C, C++, C#, Qt
or Python, not Linux or Windows, and not even Android and Java :)
Hacking it with bare metal and C failed to shed any light on the protocol.

Farzad
OK,
let me restate the problem in other words.
I have a tiny6410 Stamp Module and I want to connect a touchscreen to it,
but I don't want to use the STC12LE4052 MCU. I want to read the touchscreen
with an AVR MCU and send the result to the tiny6410 Stamp Module.
Is there any way?
What should I do?

anatoly
Understood; that's the best way.
Take the four voltage signals from the touch panel: YM, YP, XM, XP.
Connect them to the MCU's ADC0...ADC3 inputs.
When you press the panel you'll get four numbers; convert them into real
screen coordinates. Calibration is easy: tap the leftmost margin, then the
rightmost one. You'll get, for instance, ADC values of 908 and 54 for the
two X edges. Then 908-54=854 is the full scale along the X axis. Dividing
the screen X size (480, or another number depending on the LCD) by 854
gives you the X coefficient.
To obtain the proper X value, just subtract 54 from the measured ADC number
and multiply by the coefficient. The same goes for Y.
(Tip: people like floating point, but do this instead: multiply first, then
divide, all in integers.)
Connect the MCU and the tiny6410 via SPI, I2C, or UART.
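The calibration arithmetic above can be sketched in C. The endpoint values
(54 and 908) and the 480-pixel width are just the example numbers from the
post; your panel will give different ones:

```c
#include <stdint.h>

/* Calibration endpoints measured by tapping the left and right margins
   (example values from the discussion: raw ADC 54 at one X edge,
   908 at the other, on a 480-pixel-wide LCD). */
#define RAW_MIN   54
#define RAW_MAX   908
#define SCREEN_W  480

/* Integer-only scaling: multiply first, then divide, as advised above,
   to avoid floating point without losing precision. */
static uint16_t raw_to_screen_x(uint16_t raw)
{
    if (raw <= RAW_MIN) return 0;
    if (raw >= RAW_MAX) return SCREEN_W - 1;
    /* 32-bit intermediate so the multiply cannot overflow */
    return (uint16_t)(((uint32_t)(raw - RAW_MIN) * SCREEN_W) /
                      (RAW_MAX - RAW_MIN));
}
```

The clamping at the edges also absorbs a little measurement noise near the
margins.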

Alternatively, you may use the 6410 SoC's own ADC channels (the traditional way).

And one last piece of advice: why don't you want to use the U6 chip on the
LCD PCB? I think it is a TI ADS7843. It does the whole job for you: it
reads X and Y, transfers them over a serial bus, and (!) generates an IRQ.
You can connect that to one of the IRQ inputs and know immediately whether
the screen has been pressed.
Just connect the ADS7843 to the tiny6410.
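For reference, the ADS7843 conversation is simple: clock out one control
byte over its SPI-style bus, then clock in two more bytes containing the
12-bit result. A sketch of the framing; the command values and bit layout
follow my reading of the ADS7843 datasheet, so verify them against your
part before relying on this:

```c
#include <stdint.h>

/* ADS7843 control bytes: start bit, channel select, 12-bit mode,
   differential reference, power-down between conversions.
   (Per my reading of the datasheet -- double-check for your part.) */
#define ADS_CMD_X  0xD0   /* measure X position */
#define ADS_CMD_Y  0x90   /* measure Y position */

/* After the control byte the chip answers with a busy bit followed by
   the 12-bit result, MSB first, so the sample spans the next two bytes:
   byte1 carries bits 11..5 in its low 7 bits, byte2 carries bits 4..0
   in its top 5 bits. */
static uint16_t ads7843_sample(uint8_t byte1, uint8_t byte2)
{
    return (uint16_t)(((byte1 & 0x7F) << 5) | (byte2 >> 3));
}
```

The PENIRQ output mentioned above is a separate open-drain pin; it needs
only a pull-up and a GPIO/IRQ line, no extra protocol.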

Farzad
Hi dear anatoly,
Thanks for your guidance; you're right.
Following your advice, I can now read the X,Y position from the touch
panel with the AVR; that part is OK.
But I am running Windows CE on this board, and reading touchscreen
coordinates is a job for the operating system layers, so a driver program
is needed. Windows CE already has a driver that reads the touch over the
1-wire protocol, and the PCB carrying the ADS7843 connects to Windows CE
that way.
I want to know how I can send touch coordinates to the tiny6410 over
1-wire. What is the communication standard?

anatoly
Dear Farzad, did you read this:
http://en.wikipedia.org/wiki/1-Wire
I don't know your exact case or what you're doing and planning.
I can suggest looking at an old computer mouse; WinCE should support those
well. They're ball mice: in each packet they transferred deltaX, deltaY,
and the button states. You could get one, play with it, and study its
protocol, then connect it via a UART port. (My Wacom pen pad and many
other input devices work the same way.)
I mean: simply emulate a mouse, with its well-known driver, using your
custom AVR MCU. As a bonus, you get the interrupt capability too.
That will be much simpler than re-inventing 1-wire, which some people say
is unreliable. Or try playing with an iButton and the tiny6410.
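The classic target for this trick is the Microsoft serial-mouse protocol:
three 7-bit bytes per movement at 1200 baud, easy to generate from an AVR
UART. A sketch of the packet encoding, written from memory of that
protocol, so verify the bit layout before depending on it:

```c
#include <stdint.h>

/* Build a 3-byte Microsoft serial-mouse packet from signed deltas and
   button states. Layout (7 data bits per byte, 1200 baud, 7N1):
     byte0: 1 LB RB Y7 Y6 X7 X6   (bit 6 set marks packet start)
     byte1: 0 X5 X4 X3 X2 X1 X0
     byte2: 0 Y5 Y4 Y3 Y2 Y1 Y0  */
static void mouse_packet(int8_t dx, int8_t dy,
                         uint8_t left, uint8_t right, uint8_t out[3])
{
    uint8_t x = (uint8_t)dx, y = (uint8_t)dy;
    out[0] = (uint8_t)(0x40 | (left ? 0x20 : 0) | (right ? 0x10 : 0)
                            | ((y >> 4) & 0x0C) | ((x >> 6) & 0x03));
    out[1] = x & 0x3F;
    out[2] = y & 0x3F;
}
```

One caveat for a touchscreen: a mouse reports relative deltas, while the
panel gives absolute coordinates, so the AVR would have to remember the
previous sample and send the difference.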

Farzad
dear anatoly,
Thanks a lot for responding to me.
Let me now explain the whole plan to you.
I have a mini6410 board and run WinCE on it. Another board, carrying a
7-inch LCD and touchscreen, connects to the mini6410 via a flat cable.
On that board sit an ADS7843 and an STC12LE4052 MCU, which read the touch
and send the coordinates to the mini6410 over 1-wire. OK? That's the setup.
Now I have bought a tiny6410 stamp board. On the board I have designed, the
LCD connects to the tiny6410 directly, but I don't want to connect the
touchscreen directly to it, because that isn't precise, and I don't want to
use the board described above (with the ADS7843 and STC12LE4052). I read
the touch coordinates with an AVR and want to send them to the tiny6410 so
that WinCE recognizes them. What's your idea about this problem?
What should I do? I am just confused.

anatoly
Dear Farzad,
I don't have any experience with WinCE.
I can only advise you to use whichever interface you can; you know best
what is good for you. Some alternatives:
1. Use the standard connection. Just remove the four 0R resistors (links)
feeding the ADS7843 and route the panel signals to the SoC ADC channels.
It's all explained in the PCB PDF docs for the LCDs.
2. Connect the ADS7843 output to the SoC over its serial (SPI-style) bus.
3. Use one of the UART channels. Try some pointing device first (an old
mouse); WinCE should work with a mouse. Then emulate the mouse protocol
with the AVR MCU.
You can press COM or USB mice, trackpads, graphics tablets, Wii nunchuks,
Android phones' touchscreens or G-sensors, or Bluetooth mice into service
as your own pointing devices.

Just get the X and Y into the AVR by whatever means and pass them on to
the 6410 over UART, SPI, or I2C.
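Whatever bus you pick, a small framed packet keeps the 6410 side simple. A
hypothetical framing (sync byte, 12-bit X and Y split into two bytes each,
XOR checksum); this is not any standard, just a sketch:

```c
#include <stdint.h>

/* Hypothetical 6-byte frame for shipping one X/Y sample:
   0xAA sync, X high/low bytes, Y high/low bytes, XOR checksum of the
   four payload bytes. The receiver resynchronizes on 0xAA and drops
   frames whose checksum does not match. */
static void encode_touch(uint16_t x, uint16_t y, uint8_t f[6])
{
    f[0] = 0xAA;
    f[1] = x >> 8;  f[2] = x & 0xFF;
    f[3] = y >> 8;  f[4] = y & 0xFF;
    f[5] = f[1] ^ f[2] ^ f[3] ^ f[4];
}
```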

anatoly
Indeed, assembler is not needed to deal with the STC12LE4052
microcontroller.

pradeep kumar k
Hi peers,
I am developing a project on a FriendlyARM (ARM9) board. Please help me
with the following.
Here is my project description:
This smart robot will be able to perform the following receptionist chores
on its own. The robot scans the space in front of it using sensors, and if
it finds a person moving it greets them with a welcome message in clear
MP3-quality voice. Next the robot separates visitors into regular
employees and general visitors. It shows a greeting image, then renders an
on-screen keypad with letters and prompts the user to enter his signature
on the embedded touchscreen panel if he is an employee, or just his name
and the purpose of the visit if he is a general visitor. The robot also
assigns a visitor ID number to each of them. The entered name, along with
the time, is saved in memory.
After this, the robot presents a list of names (e.g. principal, chairman)
whom the visitor may want to meet, or a list of places (medical
dispensary, seminar room, etc.) where the visitor may want to go, and asks
the user to touch their choice. Once the user chooses an option on the
screen, the system acknowledges it with voice and shows a route map of the
building so the user can reach the destination, or the place where the
respective official is normally available. More importantly, before this
step it sends the visitor request through a wireless network to slave
units fixed near the officials, which display the visitor's name and
purpose. The official can communicate with the robot (call or hold) using
the push buttons on his device, and the robot will announce the visitor ID
audibly and manage visitors at the reception. Pressing a button on the
robot makes the system show the signed-in employee names along with the
corresponding times. The robot can also handle multiple languages, and the
user is guided in his language of choice.

Please send your ideas to my mail id: pradeep22kumar.k@gmail.com