<!--startcut ==========================================================-->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2//EN">
<HTML>
<HEAD>
<title>Linux and AI Issue 19</title>
</HEAD>
<BODY BGCOLOR="#f7f7f7" TEXT="#000000" LINK="#0000FF" VLINK="#007353"
ALINK="#FF0000">
<!--endcut ============================================================-->

<H4>
&quot;Linux Gazette...<I>making Linux just a little more fun!</I>&quot;
</H4>

<P> <HR> <P> 
<!--===================================================================-->

<center>
<H2>Linux and Artificial Intelligence</H2>
<H4>By John Eikenberry,
<a href="mailto:jae@ai.uga.edu">jae@ai.uga.edu</a></H4>
</center>
<P><HR>


    Three years ago, when I was starting the last year of my master's
    degree in philosophy, I found myself asking that eternal question,
    "Ok, now what in the hell am I going to do?" Not wanting to
    continue on in philosophy, what could a philosopher (and computer
    enthusiast) do that would be both fun and profitable?  Artificial
    Intelligence, of course (but you saw that coming, didn't you?)
    <P></P>

    I had fallen in love with Linux in late 1993, and after seeing all
    the Suns scattered about the AI Dept., it seemed like the perfect
    OS for AI research. Guess what, I was right. I have found so many
    resources available for doing AI research on Linux that I had to
    write them all down (warning: blatant plug follows), thus my <a
    href="http://www.ai.uga.edu/students/jae/ai.html">Linux AI/Alife
    mini-HOWTO</a> came into being.  <P></P>
      
    Ok, enough of this drivel, now on to the meat of the article.
    <P></P>
      
    Modern AI is a many-faceted field of research, dealing with
    everything from 'traditional' logic-based systems to
    connectionism, evolutionary computing, artificial life, and
    autonomous agents. With Unix being the main platform for AI, there
    are many excellent resources available for Linux in each of these
    areas. In the rest of this article I'll give a brief description
    of each of these areas, along with one of the more interesting
    resources available to the Linux user.  <P></P>

    <hr>
    
    <DL>

      <DT><b>PROGRAMMING LANGUAGES</B>
	
      <DD>
	I know I didn't mention this above, but there are many
	programming languages that have been specifically designed
	with AI applications in mind.
      </DD>
      
    <P></P>

      <DT>DFKI OZ<BR>
	Web page: <A HREF="http://www.ps.uni-sb.de/oz/">www.ps.uni-sb.de/oz/</A><BR>
	FTP site: <A HREF="ftp://ps-ftp.dfki.uni-sb.de/pub/oz2/">ps-ftp.dfki.uni-sb.de/pub/oz2/</A>
	
      <DD>
	Oz is a high-level programming language designed for
	concurrent symbolic computation.  It is based on a new
	computation model providing a uniform and simple foundation
	for several programming paradigms, including higher-order
	functional, constraint logic, and concurrent object-oriented
	programming.  Oz is designed as a successor to languages such
	as Lisp, Prolog and Smalltalk, which fail to support
	applications that require concurrency, reactivity, and
	real-time control.  <P></P>

	DFKI Oz is an interactive implementation of Oz featuring a
	programming interface based on GNU Emacs, a concurrent
	browser, an object-oriented interface to Tcl/Tk, powerful
	interoperability features (sockets, C, C++), an incremental
	compiler, a garbage collector, and support for stand-alone
	applications.  Performance is competitive with commercial
	Prolog and Lisp systems.
      </DD>
    </DL>
    
    <hr>

    <DL>

      <DT><B>TRADITIONAL ARTIFICIAL INTELLIGENCE</B>
	
      <DD>
	Traditional AI is based around the ideas of logic, rule
	systems, linguistics, and the concept of rationality.  At its
	roots are programming languages such as Lisp and Prolog.
	Expert systems are the largest successful example of this
	paradigm.  An expert system consists of a detailed knowledge
	base and a complex rule system to utilize it.  Such systems
	have been used for tasks like medical diagnosis support and
	credit checking.  (A minimal sketch of the rule-based idea
	follows this list.)
      </DD>
      
    <P></P>

      <DT>SNePS<BR> 
	Web site: <A
	HREF="http://www.cs.buffalo.edu/pub/sneps/WWW/">www.cs.buffalo.edu/pub/sneps/WWW/</A><BR>
	FTP site: <A
	HREF="ftp://ftp.cs.buffalo.edu/pub/sneps/">ftp.cs.buffalo.edu/pub/sneps/</A>
	
      <DD>
	The long-term goal of The SNePS Research Group is the design
	and construction of a natural-language-using computerized
	cognitive agent, and carrying out the research in artificial
	intelligence, computational linguistics, and cognitive science
	necessary for that endeavor. The three-part focus of the group
	is on knowledge representation, reasoning, and natural-language
	understanding and generation. The group is widely known for its
	development of the SNePS knowledge representation/reasoning
	system, and Cassie, its computerized cognitive agent.
      </DD>
    </DL>
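
    To make the knowledge-base-plus-rules idea above a little more
    concrete, here is a minimal forward-chaining sketch in C.  It is
    purely illustrative (it is not code from SNePS or any real expert
    system shell, and the facts and rules are invented): a rule fires
    when all of its premises are already known facts, and its
    conclusion is then added to the fact base until nothing new can
    be derived.  <P></P>

    <PRE>
/* Minimal forward-chaining sketch: facts are strings, a rule fires
   when all of its premises are known, adding its conclusion.
   Illustrative only -- not code from SNePS or any real shell. */
#include &lt;stdio.h&gt;
#include &lt;string.h&gt;

#define MAXFACTS 32

const char *facts[MAXFACTS] = { "fever", "cough" };
int nfacts = 2;

struct rule { const char *premise1, *premise2, *conclusion; };
struct rule rules[] = {
    { "fever", "cough",      "flu-suspected"  },
    { "flu-suspected", NULL, "recommend-rest" }
};

int known(const char *f)
{
    int i;
    for (i = 0; i &lt; nfacts; i++)
        if (strcmp(facts[i], f) == 0) return 1;
    return 0;
}

int main(void)
{
    int changed = 1, i;
    while (changed) {                /* loop until no rule adds anything new */
        changed = 0;
        for (i = 0; i &lt; 2; i++) {
            struct rule *r = &amp;rules[i];
            if (known(r-&gt;premise1)
                &amp;&amp; (r-&gt;premise2 == NULL || known(r-&gt;premise2))
                &amp;&amp; !known(r-&gt;conclusion)) {
                facts[nfacts++] = r-&gt;conclusion;
                changed = 1;
            }
        }
    }
    for (i = 0; i &lt; nfacts; i++)     /* print everything we now know */
        printf("%s\n", facts[i]);
    return 0;
}
    </PRE>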
    
    <hr>

    <DL>

      <DT><B>CONNECTIONISM</B> 

      <DD>
	Connectionism is a technical term for a group of related
	techniques. These techniques include areas such as Artificial
	Neural Networks, Semantic Networks, and a few other similar
	ideas. My present focus is on neural networks (though I am
	looking for resources on the other techniques). Neural
	networks are programs designed to simulate the workings of the
	brain. They consist of a network of simple mathematical nodes
	which work together to form patterns of information.  They
	have tremendous potential and currently seem to be having a
	great deal of success with image processing and robot control.
	(A toy forward-pass example follows this list.)
      </DD>

    <P></P>
      
      <DT>PDP++<BR>
	Web site: <a href="http://www.cnbc.cmu.edu/PDP++/">www.cnbc.cmu.edu/PDP++/</a><br>
	FTP site (US): <a href="ftp://cnbc.cmu.edu/pub/pdp++/">cnbc.cmu.edu/pub/pdp++/</a><br>
	FTP site (Europe): <a href="ftp://unix.hensa.ac.uk/mirrors/pdp++/">
unix.hensa.ac.uk/mirrors/pdp++/ </a>

      <DD>
	As the field of connectionist modeling has grown, so has the
	need for a comprehensive simulation environment for the
	development and testing of connectionist models. Our goal in
	developing PDP++ has been to integrate several powerful
	software development and user interface tools into a general
	purpose simulation environment that is both user friendly and
	user extensible. The simulator is built in the C++ programming
	language, and incorporates a state of the art script
	interpreter with the full expressive power of C++. The
	graphical user interface is built with the Interviews toolkit,
	and allows full access to the data structures and processing
	modules out of which the simulator is built. We have
	constructed several useful graphical modules for easy
	interaction with the structure and the contents of neural
	networks, and we've made it possible to change and adapt many
	things. At the programming level, we have set things up in
	such a way as to make user extensions as painless as
	possible. The programmer creates new C++ objects, which might
	be new kinds of units or new kinds of processes; once compiled
	and linked into the simulator, these new objects can then be
	accessed and used like any other.
      </DD>
    </DL>
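
    The "network of simple mathematical nodes" described above can be
    illustrated in a few lines of C.  This is only a toy sketch (it
    is not PDP++ code, and the weights are arbitrary rather than
    learned): each hidden unit takes a weighted sum of the inputs and
    squashes it with a sigmoid, and the output unit does the same
    over the hidden layer.  <P></P>

    <PRE>
/* Toy feed-forward pass: two inputs, two hidden units, one output.
   The weights are arbitrary; a real network would learn them (for
   example by backpropagation).  Compile with -lm. */
#include &lt;stdio.h&gt;
#include &lt;math.h&gt;

double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

int main(void)
{
    double in[2]          = { 0.5, 0.9 };
    double w_hidden[2][2] = { { 0.8, -0.4 }, { 0.3, 0.7 } };
    double w_out[2]       = { 1.2, -0.6 };
    double hidden[2], out = 0.0;
    int i, j;

    for (i = 0; i &lt; 2; i++) {        /* each hidden unit: weighted sum + squash */
        double sum = 0.0;
        for (j = 0; j &lt; 2; j++)
            sum += w_hidden[i][j] * in[j];
        hidden[i] = sigmoid(sum);
    }
    for (i = 0; i &lt; 2; i++)          /* output unit over the hidden layer */
        out += w_out[i] * hidden[i];
    out = sigmoid(out);

    printf("network output: %f\n", out);
    return 0;
}
    </PRE>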

    <hr>

    <DL>
      
      <DT><B>EVOLUTIONARY COMPUTING [EC]</B>

      <DD>
	Evolutionary computing is actually a broad term for a vast
	array of programming techniques, including genetic algorithms,
	complex adaptive systems, evolutionary programming, etc.  The
	main thrust of all these techniques is the idea of evolution:
	the idea that a program can be written that will
	<i>evolve</i> toward a certain goal.  This goal can be
	anything from solving some engineering problem to winning a
	game.  (A bare-bones genetic algorithm follows this list.)
      </DD>

    <P></P>

      <DT>GAGS<BR>
	Web site: <A HREF="http://kal-el.ugr.es/gags.html">kal-el.ugr.es/gags.html</A><BR>
	FTP site: <A HREF="ftp://kal-el.ugr.es/GAGS/">kal-el.ugr.es/GAGS/</A>

      <DD>
	Genetic Algorithm application generator and class library
	written mainly in C++.
	<BR>
	As a class library, and among other things, GAGS includes:
        <UL>
	  <LI>A <em>chromosome hierarchy</em> with variable-length
	  chromosomes.
	  <LI><em>Genetic operators</em>: 2-point crossover, uniform
	  crossover, bit-flip mutation, transposition (gene
	  interchange between 2 parts of the chromosome), and
	  variable-length operators: duplication, elimination, and
	  random addition.
	  <LI><em>Population level operators</em> include steady
	  state, roulette wheel and tournament selection.
	  <LI><em>Gnuplot wrapper</em>: turns gnuplot into an
	  <code>iostreams</code>-like class.
	  <LI>Easy sample file loading and configuration file parsing.
        </ul>
	As an application generator (written in <code>PERL</code>),
	you only need to supply it with an ANSI-C or C++ fitness
	function, and it creates a C++ program that uses the above
	library to 90% capacity, compiles it, and runs it, saving
	results and presenting fitness through <code>gnuplot</code>.
      </DD>
      
    </DL>
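
    To show what "evolving toward a goal" looks like in practice,
    here is a bare-bones genetic algorithm in C.  It is illustrative
    only (it is not GAGS code): chromosomes are 16-bit strings, the
    fitness function simply counts 1 bits ("one-max"), and each
    generation applies tournament selection and bit-flip mutation;
    crossover is left out for brevity.  <P></P>

    <PRE>
/* Bare-bones genetic algorithm over 16-bit chromosomes, maximizing
   the number of 1 bits.  Tournament selection plus bit-flip
   mutation; crossover omitted for brevity. */
#include &lt;stdio.h&gt;
#include &lt;stdlib.h&gt;
#include &lt;time.h&gt;

#define POP  20
#define BITS 16
#define GENS 50

int fitness(unsigned chrom)          /* count the 1 bits */
{
    int i, ones = 0;
    for (i = 0; i &lt; BITS; i++)
        if (chrom &amp; (1u &lt;&lt; i)) ones++;
    return ones;
}

int main(void)
{
    unsigned pop[POP], next[POP];
    int g, i, best;
    srand(time(NULL));

    for (i = 0; i &lt; POP; i++)        /* random initial population */
        pop[i] = rand() &amp; 0xFFFF;

    for (g = 0; g &lt; GENS; g++) {
        for (i = 0; i &lt; POP; i++) {
            /* tournament selection: pick two, keep the fitter */
            unsigned a = pop[rand() % POP], b = pop[rand() % POP];
            unsigned child = fitness(a) &gt; fitness(b) ? a : b;
            /* bit-flip mutation: occasionally flip one random bit */
            if (rand() % 4 == 0)
                child ^= 1u &lt;&lt; (rand() % BITS);
            next[i] = child;
        }
        for (i = 0; i &lt; POP; i++)
            pop[i] = next[i];
    }

    best = 0;
    for (i = 1; i &lt; POP; i++)
        if (fitness(pop[i]) &gt; fitness(pop[best])) best = i;
    printf("best fitness after %d generations: %d\n",
           GENS, fitness(pop[best]));
    return 0;
}
    </PRE>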

    <hr>

    <DL>

      <DT><b>ALIFE</b>
	
      <DD>
	Alife takes yet another approach to exploring the mysteries of
	intelligence.  It has many aspects similar to EC and
	connectionism, but takes these ideas and gives them a
	meta-level twist. Alife emphasizes the development of
	intelligence through <i>emergent</i> behavior of <i>complex
	adaptive systems</i>.  Alife stresses the social or
	group-based aspects of intelligence. It seeks to understand
	life and survival. By studying the behaviors of groups of
	'beings', Alife seeks to discover the way intelligence or
	higher-order activity emerges from seemingly simple
	individuals. Cellular Automata and Conway's Game of Life are
	probably the most commonly known applications of this field.
	(A tiny Game of Life sketch follows this list.)
      </DD>

    <P></P>

      <DT>Tierra<BR>
	Web site: <A HREF="http://www.hip.atr.co.jp/~ray/tierra/tierra.html">www.hip.atr.co.jp/~ray/tierra/tierra.html</A> <br>
	FTP site: <A HREF="ftp://alife.santafe.edu/pub/SOFTWARE/Tierra/">alife.santafe.edu/pub/SOFTWARE/Tierra/</A><BR>
	Alternate FTP site: <a href="ftp://ftp.cc.gatech.edu/ac121/linux/science/biology/">ftp.cc.gatech.edu/ac121/linux/science/biology/</a>

      <DD>
	Tierra's written in the C programming language. This source
	code creates a virtual computer and its operating system,
	whose architecture has been designed in such a way that the
	executable machine codes are evolvable. This means that the
	machine code can be mutated (by flipping bits at random) or
	recombined (by swapping segments of code between algorithms),
	and the resulting code remains functional enough of the time
	for natural (or presumably artificial) selection to be able to
	improve the code over time.
      </DD>

    </DL>
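
    Since Conway's Game of Life is mentioned above as the best-known
    example of this field, here is a tiny C version of it: a small
    wrapped grid seeded with a glider, updated by the usual rules (a
    live cell survives with 2 or 3 live neighbours, a dead cell comes
    alive with exactly 3).  It is illustrative only and unrelated to
    Tierra.  <P></P>

    <PRE>
/* A few generations of Conway's Game of Life on a small toroidal
   grid, seeded with a glider. */
#include &lt;stdio.h&gt;

#define N 8

int grid[N][N], work[N][N];

int neighbours(int r, int c)    /* live cells in the 8 surrounding squares */
{
    int dr, dc, n = 0;
    for (dr = -1; dr &lt;= 1; dr++)
        for (dc = -1; dc &lt;= 1; dc++)
            if (dr || dc)
                n += grid[(r + dr + N) % N][(c + dc + N) % N];
    return n;
}

int main(void)
{
    int r, c, gen;
    grid[1][2] = grid[2][3] = grid[3][1] = grid[3][2] = grid[3][3] = 1;  /* a glider */

    for (gen = 0; gen &lt; 4; gen++) {
        for (r = 0; r &lt; N; r++)
            for (c = 0; c &lt; N; c++) {
                int n = neighbours(r, c);
                work[r][c] = grid[r][c] ? (n == 2 || n == 3) : (n == 3);
            }
        for (r = 0; r &lt; N; r++)
            for (c = 0; c &lt; N; c++)
                grid[r][c] = work[r][c];
    }

    for (r = 0; r &lt; N; r++) {        /* print the grid: '*' = alive */
        for (c = 0; c &lt; N; c++)
            putchar(grid[r][c] ? '*' : '.');
        putchar('\n');
    }
    return 0;
}
    </PRE>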

    <hr>

    <DL>

      <DT><B>AUTONOMOUS AGENTS</B>

      <DD>
	Also known as intelligent software agents or just agents, this
	area of AI research deals with simple applications of small
	programs that aid the user in his/her work. They can be mobile
	(able to stop their execution on one machine and resume it on
	another) or static (living on one machine). They are usually
	specific to the task (and therefore fairly simple) and meant
	to help the user much as an assistant would. The most widely
	known uses of this type of application to date are the web
	robots that many of the indexing engines (e.g. WebCrawler)
	use.  (A sketch of the state-migration idea behind mobile
	agents follows this list.)
      </DD>

    <P></P>

      <DT>Ara<BR>
	Web site: <A HREF="http://www.uni-kl.de/AG-Nehmer/Ara/">www.uni-kl.de/AG-Nehmer/Ara/</A>

      <DD>
	Ara is a platform for the portable and secure execution of
	mobile agents in heterogeneous networks. Mobile agents in this
	sense are programs with the ability to change their host
	machine during execution while preserving their internal
	state. This enables them to handle interactions locally which
	would otherwise have to be performed remotely. Ara's specific aim in
	comparison to similar platforms is to provide full mobile
	agent functionality while retaining as much as possible of
	established programming models and languages.
      </DD>
    </DL>
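
    The "stop on one machine, resume on another" idea behind mobile
    agents boils down to making the agent's state explicit so that it
    can be shipped between hosts.  The following C fragment is only a
    sketch of that idea (it is not how Ara actually works, and the
    fields are invented for illustration): the agent's state is
    written to a file that another host could read back before
    resuming the agent.  <P></P>

    <PRE>
/* Sketch of agent-state migration: an "agent" here is just explicit
   state that can be saved on one host and restored on another.
   Illustrative only -- not Ara code. */
#include &lt;stdio.h&gt;

struct agent_state {
    int  pages_visited;
    long bytes_indexed;
    char next_url[256];
};

int save_agent(const struct agent_state *a, const char *path)
{
    FILE *f = fopen(path, "wb");
    if (!f) return -1;
    fwrite(a, sizeof *a, 1, f);     /* this file would be shipped to the next host */
    fclose(f);
    return 0;
}

int restore_agent(struct agent_state *a, const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f) return -1;
    fread(a, sizeof *a, 1, f);      /* execution resumes with the old state */
    fclose(f);
    return 0;
}

int main(void)
{
    struct agent_state a = { 12, 34567L, "http://www.example.com/" };
    struct agent_state b;

    save_agent(&amp;a, "agent.state");
    restore_agent(&amp;b, "agent.state");
    printf("resumed after %d pages, next: %s\n", b.pages_visited, b.next_url);
    return 0;
}
    </PRE>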

<!--===================================================================-->
<P> <hr> <P> 
<center><H5>Copyright &copy; 1997, John Eikenberry<BR> 
Published in Issue 19 of the Linux Gazette, July 1997</H5></center>

<!--===================================================================-->
<P> <hr> <P> 
<A HREF="./lg_toc19.html"><IMG ALIGN=BOTTOM SRC="../gx/indexnew.gif" 
ALT="[ TABLE OF CONTENTS ]"></A>
<A HREF="../lg_frontpage.html"><IMG ALIGN=BOTTOM SRC="../gx/homenew.gif"
ALT="[ FRONT PAGE ]"></A>
<A HREF="./hallways.html"><IMG SRC="../gx/back2.gif"
ALT=" Back "></A>
<A HREF="./program.html"><IMG SRC="../gx/fwd.gif" ALT=" Next "></A>
<P> <hr> <P> 
<!--startcut ==========================================================-->
</BODY>
</HTML>
<!--endcut ============================================================-->