<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.luntti.net/index.php?action=history&amp;feed=atom&amp;title=Line_follower_proportional_py_v2%2Fen</id>
	<title>Line follower proportional py v2/en - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.luntti.net/index.php?action=history&amp;feed=atom&amp;title=Line_follower_proportional_py_v2%2Fen"/>
	<link rel="alternate" type="text/html" href="https://wiki.luntti.net/index.php?title=Line_follower_proportional_py_v2/en&amp;action=history"/>
	<updated>2026-04-09T04:47:08Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.40.0</generator>
	<entry>
		<id>https://wiki.luntti.net/index.php?title=Line_follower_proportional_py_v2/en&amp;diff=1350&amp;oldid=prev</id>
		<title>Bette183114776: Updating to match new version of source page</title>
		<link rel="alternate" type="text/html" href="https://wiki.luntti.net/index.php?title=Line_follower_proportional_py_v2/en&amp;diff=1350&amp;oldid=prev"/>
		<updated>2020-09-23T15:00:32Z</updated>

		<summary type="html">&lt;p&gt;Updating to match new version of source page&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;&amp;lt;languages /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
&lt;br /&gt;
=== Robot ===&lt;br /&gt;
&lt;br /&gt;
The idea and principle work for almost any robot, though this is tested using Asimov.&lt;br /&gt;
&lt;br /&gt;
=== Sensors ===&lt;br /&gt;
&lt;br /&gt;
The color sensor in reflected light intensity mode is used.&lt;br /&gt;
The sensor convention is&lt;br /&gt;
#port 1 = touch, &lt;br /&gt;
#port 2 = gyro, &lt;br /&gt;
#port 3 = color, &lt;br /&gt;
#port 4 = infrared or ultrasonic&lt;br /&gt;
&lt;br /&gt;
== An Illuminating Example ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;youtube&amp;gt;q8XD_El4DEI&amp;lt;/youtube&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Theory ==&lt;br /&gt;
&lt;br /&gt;
The proportional line follower actually follows one edge of the line. The turning radius is calculated from the minimum, maximum and current color sensor readings, scaled by a proportional coefficient (P). Note that the value passed to the steering function must be between -100 and +100, so we clamp it with Python&amp;#039;s built-in min and max functions.&lt;br /&gt;
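&lt;br /&gt;
The clamped proportional formula above can be tried off the robot as a plain Python function. This is only a sketch: the clMin, clMax and P values are the ones used in the example code, and the name compute_steering is made up for illustration.&lt;br /&gt;

```python
def compute_steering(reading, cl_min=30, cl_max=84, p=2.0):
    """Map a reflected-light reading to a steering value in [-100, 100]."""
    cl_ave = (cl_min + cl_max) / 2        # midpoint between dark and light
    steering = p * (reading - cl_ave)     # proportional error term
    return max(-100, min(100, steering))  # clamp to the allowed steering range

print(compute_steering(57))   # on the edge: 0.0, drive straight
print(compute_steering(84))   # full light: 54.0, turn one way
print(compute_steering(5))    # clamped to -100
```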
&lt;br /&gt;
== An Example Code ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;python&amp;quot;&amp;gt;&lt;br /&gt;
#!/usr/bin/env python3&lt;br /&gt;
# https://sites.google.com/site/ev3devpython/&lt;br /&gt;
&lt;br /&gt;
#Sensor port convention:&lt;br /&gt;
#port 1 = touch, port 2 = gyro, port 3 = color, port 4 = infrared or ultrasonic.&lt;br /&gt;
#Calibrated readings: 84 is the maximum (light), 30 is the minimum (dark)&lt;br /&gt;
&lt;br /&gt;
from ev3dev2.sensor.lego import ColorSensor&lt;br /&gt;
from ev3dev2.motor import MoveSteering, OUTPUT_B, OUTPUT_C&lt;br /&gt;
from time import sleep&lt;br /&gt;
import os&lt;br /&gt;
&lt;br /&gt;
os.system(&amp;#039;setfont Lat15-TerminusBold32x16&amp;#039;) &lt;br /&gt;
&lt;br /&gt;
steer_pair = MoveSteering(OUTPUT_B, OUTPUT_C)&lt;br /&gt;
steer_pair.on(steering=0, speed=10)&lt;br /&gt;
&lt;br /&gt;
cl = ColorSensor() &lt;br /&gt;
clMax = 84&lt;br /&gt;
clMin = 30&lt;br /&gt;
clAve = (clMax + clMin)/2&lt;br /&gt;
P = 2.0&lt;br /&gt;
&lt;br /&gt;
clN = clAve&lt;br /&gt;
steering = 0&lt;br /&gt;
&lt;br /&gt;
while True:&lt;br /&gt;
    clN = cl.reflected_light_intensity&lt;br /&gt;
    #print( clN )&lt;br /&gt;
    #print( clAve )&lt;br /&gt;
    steering = P*( clN - clAve )&lt;br /&gt;
    steering = min(steering, 100)&lt;br /&gt;
    steering = max(steering, -100)&lt;br /&gt;
    print( steering )&lt;br /&gt;
    steer_pair.on(steering=steering, speed=20)&lt;br /&gt;
&lt;br /&gt;
#Reached only once the while loop above is made finite (see exercise 4)&lt;br /&gt;
steer_pair.off()&lt;br /&gt;
sleep(5)&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&amp;lt;youtube&amp;gt;6j8KsGABPdU&amp;lt;/youtube&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Exercises ==&lt;br /&gt;
&lt;br /&gt;
1. It is difficult to debug the robot as it is silent. Make it say whether it is in a dark or a light area. Use the [https://python-ev3dev.readthedocs.io/en/ev3dev-stretch/sensors.html#color-sensor Sound.speak(&amp;#039;White&amp;#039;).wait()] command or, for example, the sound.beep() command. Sound is imported with from ev3dev2.sound import Sound.&lt;br /&gt;
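&lt;br /&gt;
The speak or beep call itself needs the robot, but the dark-or-light decision for exercise 1 can be tested anywhere. A sketch, using the midpoint of the example&amp;#039;s clMin and clMax as the threshold (the name classify_area is an assumption):&lt;br /&gt;

```python
def classify_area(reading, cl_min=30, cl_max=84):
    """Return the word the robot should speak for a given sensor reading."""
    cl_ave = (cl_min + cl_max) / 2   # same midpoint the follower uses
    return "White" if reading >= cl_ave else "Black"

# On the robot this string would be passed to Sound.speak(...).
print(classify_area(80))   # White
print(classify_area(35))   # Black
```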
&lt;br /&gt;
2. Make the robot move faster. Note that you need to adjust the parameters to suit your line (and your robot). Generally, it is advised to change only one value at a time. Time your original run and try to halve it.&lt;br /&gt;
&lt;br /&gt;
3. Let the robot use the other side of the line.&lt;br /&gt;
&lt;br /&gt;
4. Now the while loop runs forever. Make the robot stop when the right motor has turned 3.4 revolutions. See [https://sites.google.com/site/ev3devpython/learn_ev3_python/using-motors ev3 Python] for help.&lt;br /&gt;
&lt;br /&gt;
5. Make the robot stop when it encounters a silver tape (highly reflective).&lt;br /&gt;
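&lt;br /&gt;
For exercise 5, silver tape reflects more light than plain white, so one possible approach is to flag readings clearly above the calibrated maximum. This is a sketch only; the margin value is an assumption and must be tuned against your own tape.&lt;br /&gt;

```python
def is_silver(reading, cl_max=84, margin=8):
    """Heuristic: silver tape reads noticeably brighter than white paper."""
    return reading > cl_max + margin

print(is_silver(98))   # True: brighter than white plus margin
print(is_silver(80))   # False: ordinary light area
```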
&lt;br /&gt;
6. Rescue is a robot game in which the robot needs to follow a dashed line. So, make a line with a segment missing, and make your robot drive across the gap and pick up the line on the other side of the missing segment.&lt;br /&gt;
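&lt;br /&gt;
One possible trick for exercise 6, sketched under the assumption that a missing segment shows up as a near-maximum (all light) reading: keep the last useful steering value while the sensor sees only light, so the robot coasts straight across the gap.&lt;br /&gt;

```python
def bridge_gap_steering(reading, last_steering, cl_min=30, cl_max=84, p=2.0):
    """Proportional steering that coasts on the last value over a missing segment."""
    if reading >= cl_max - 2:      # near-maximum reading: line segment lost
        return last_steering       # keep the previous heading across the gap
    cl_ave = (cl_min + cl_max) / 2
    steering = p * (reading - cl_ave)
    return max(-100, min(100, steering))

print(bridge_gap_steering(84, last_steering=0))   # gap: reuses last steering
print(bridge_gap_steering(40, last_steering=0))   # on the edge: normal P control
```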
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category: Python v2]]&lt;br /&gt;
[[Category: Line follower]]&lt;br /&gt;
[[Category: Color sensor py v2]]&lt;br /&gt;
[[Category: Asimov]]&lt;br /&gt;
&lt;br /&gt;
== About == &lt;br /&gt;
&lt;br /&gt;
This course is supported by [https://meet-and-code.org/ Meet and Code]. The course is made in collaboration with [http://www.fllsuomi.org/ Robotiikka- ja tiedekasvatus ry].&lt;br /&gt;
[[File:MeetAndcodeLogo.png|thumb]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Meet_and_Code_2020_II:_Python | Meet and Code II: Python]]&lt;/div&gt;</summary>
		<author><name>Bette183114776</name></author>
	</entry>
</feed>