Programming |OT| C is better than C++! No, C++ is better than C

Self-teaching programming has always seemed like a really bad idea to me. I'd be interested in hearing success stories, but from my -- probably biased -- perspective, it seems like developing bad habits would be so damn easy. Programming itself has always seemed like a means to an end to me, rather than the end itself.
If you learn from good books and online references then it really isn't much of an issue. You just need something that doesn't go "Here's the syntax, go with it!" but something that actually details convention and then explains WHY programmers tend to do it that way in terms of the benefits to you.
 
If you learn from good books and online references then it really isn't much of an issue. You just need something that doesn't go "Here's the syntax, go with it!" but something that actually details convention and then explains WHY programmers tend to do it that way in terms of the benefits to you.
Yeah, for example there are books about data structures and algorithms specifically. I have one called just that, "Data Structures and Algorithms in C++". It's usually a good idea to read through one of those once you read through the book that's teaching you the language. And there are books like "How to write better code" and such. And one of the best things to do is to read the source code for big open source projects and try to understand why it's written the way it is.

It's of course also good to have someone that one can talk to about the subject. Joining an open source project and learning from the guys that are better than you, and teaching the guys that are not as good as you is also a great way to learn because you can bounce ideas off of each other. All it boils down to is that one needs to put the time and effort into it, like with anything.
 
Usually JavaScript
Or an AJAX call, but not good practice for that
Actually, it's typically both.

Scrolling observer notified we reached the end of the currently loaded tweets -> bottom element is changed to a loading message -> AJAX call sent.

Data retrieved -> callback fired (whether success or failure, there will be some callback):
if success
append results to the currently loaded tweets
else if failure
change the loading message to an error message (error getting tweets and those types of deals)
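That flow is easy to mock up outside the browser too. Here's a minimal Python sketch of the same callback shape (fetch_tweets, on_success and on_failure are hypothetical stand-ins for the AJAX call and the DOM updates):

```python
def load_more(fetch_tweets, on_success, on_failure):
    # Fired when the scroll observer hits the last loaded tweet.
    try:
        tweets = fetch_tweets()       # stands in for the AJAX request
    except IOError as err:
        on_failure(err)               # loading message becomes an error message
    else:
        on_success(tweets)            # append results to the loaded tweets

loaded = ["tweet 1", "tweet 2"]
load_more(lambda: ["tweet 3", "tweet 4"],
          on_success=loaded.extend,
          on_failure=lambda err: loaded.append("error getting tweets"))
```

Either way, exactly one of the two callbacks fires per request, which is the whole point of the success/failure split.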
 
On the C vs C++ debate, this says it all:

http://harmful.cat-v.org/software/c++/linus

For those who don't know, Linus Torvalds is the creator and maintainer of the Linux kernel.


It is amazing how you never stop learning how to program. My major/research as a grad student requires me to learn all sorts of language extensions and libraries to be able to do parallel programming efficiently. I wish people would just use functional languages, like Haskell, where the parallelism is inherent in the language itself, but oh well...

I hope I can't be traced to who I really am IRL, because I really wanna talk about what I'm doing -- or trying to do -- without getting caught due to all the NDAs. But it involves FDTD, Intel and rhymes with spike.

----

Kinda related to the C vs C++ debate, and I hope it catches on: object-oriented programming as the empty calories of programming. You write a lot of code that really does nothing. Any program you can think of can be written in fewer lines if you avoid OOP. What they should be teaching in CS programs instead is functional programming. It is a godsend when complex programs can be written in so few lines of code.


For a beginner, is it best to start C or C#?
Neither. Do yourself a favor and learn Haskell. You will thank me.

This is a complete Haskell program in 2 lines:
factorial 0 = 1
factorial n = n * factorial (n - 1)

Try that in any other language and you will need at least 2x the number of lines. As the program gets more complex, it only becomes easier to do in Haskell compared to other languages.
 
On the C vs C++ debate, this says it all:

http://harmful.cat-v.org/software/c++/linus

For those who don't know, Linus Torvalds is the creator and maintainer of the Linux kernel.
Okay, but that isn't a reason to not learn C++. That guy has very refined tastes in terms of programming, and there is a great need for C++ programmers. Seeing as Java, C#, and Python all have quite a few similarities, there is no reason to avoid learning C++. I understand the guy is revered in the industry, but I try to take these "tech god opinions" with a grain of salt. They do not represent the average programmer and would rather see the average programmer go extinct.

Kinda related to the C vs C++ debate, and I hope it catches on: object-oriented programming as the empty calories of programming. You write a lot of code that really does nothing. Any program you can think of can be written in fewer lines if you avoid OOP. What they should be teaching in CS programs instead is functional programming. It is a godsend when complex programs can be written in so few lines of code.

Neither. Do yourself a favor and learn Haskell. You will thank me.

This is a complete Haskell program in 2 lines:
factorial 0 = 1
factorial n = n * factorial (n - 1)

Try that in any other language and you will need at least 2x the number of lines. As the program gets more complex, it only becomes easier to do in Haskell compared to other languages.

There is no good reason to learn C++. I'd rather invest my time in Fortran, which is highly relevant in high performance and parallel computing, as is C. OpenMP and MPI support two languages, C and Fortran, and there is nothing more demanding than high performance computing. C++, maybe, if you want to write in-house applications for some company? But it is really bad, and it has been and continues to be a very bad language through and through. Believe me, I unfortunately learned it.
Since when did writing fewer lines of code determine how good a language is? I thought it was all about how powerful your control was over the inner workings of the memory and logic. Obviously C or assembly would meet this, but Python does a lot of the work for you, and Haskell seems to do the same. Is there anything else that makes Haskell so great?
 
Okay, but that isn't a reason to not learn C++. That guy has very refined tastes in terms of programming, and there is a great need for C++ programmers. Seeing as Java, C#, and Python all have quite a few similarities, there is no reason to avoid learning C++. I understand the guy is revered in the industry, but I try to take these "tech god opinions" with a grain of salt. They do not represent the average programmer and would rather see the average programmer go extinct.
There is no good reason to learn C++. I'd rather invest my time in Fortran, which is highly relevant in high performance and parallel computing, as is C. OpenMP and MPI support two languages, C and Fortran, and there is nothing more demanding than high performance computing. C++, maybe, if you want to write in-house applications for some company? But it is really bad, and it has been and continues to be a very bad language through and through. Believe me, I unfortunately learned it.

Some C++ humor
http://harmful.cat-v.org/software/c++/
http://www.stgray.com/quotes/cppquotes.html

This is my favorite


I saw `cout' being shifted "Hello world" times to the left and stopped right there.
— Steve Gonedes


PS: I'm a computer scientist. I understand that in the business world C++ is popular, but with everything I know and believe in, and with all my experiences, it is a badly designed language that forces you to write bad code. It is best to learn an elegant and clean language like Lisp, Prolog or Haskell. These languages don't require you to know too many keywords that you won't be able to understand until you understand advanced concepts. If you know the math, what you want the program to do and how it will be done, you can start writing a program immediately. From there you advance to languages such as C or Fortran, or Smalltalk and Java if you want/need OOP.
 
Writing less = good is usually a novice misconception. A great example, I think, is LINQ in C#. You can do some insane list transforms that would take dozens of lines of code in a single one, but it tends to get unreadable quickly, and there is too much logic concentrated in a few statements, which also makes it harder to debug. OO, if anything, is about clarity: you write more because the code explains itself through objects and methods; that's really what the extra work buys you. Functional programming languages are often powerful but can get confusing faster.

I do wish more people were exposed to functional programming though, it has some major benefits but we just don't have a lot of documentation on good practices.
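For anyone who hasn't hit this trade-off yet, here's roughly what it looks like in Python (standing in for LINQ, since the idea is the same): a dense one-liner versus the spelled-out version that's easier to step through in a debugger.

```python
words = ["fig", "carrot", "apple", "banana"]

# Dense one-liner: uppercase every word longer than three letters,
# sorted by length.
dense = sorted((w.upper() for w in words if len(w) > 3), key=len)

# The same transform spelled out. More lines, but each stage can be
# inspected on its own while debugging.
spelled_out = []
for w in words:
    if len(w) > 3:
        spelled_out.append(w.upper())
spelled_out.sort(key=len)
```

Neither version is wrong; the point is just that the one-liner concentrates the filter, the map and the sort into a single statement.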
 
Writing less = good is usually a novice misconception. A great example, I think, is LINQ in C#. You can do some insane list transforms that would take dozens of lines of code in a single one, but it tends to get unreadable quickly, and there is too much logic concentrated in a few statements, which also makes it harder to debug. OO, if anything, is about clarity: you write more because the code explains itself through objects and methods; that's really what the extra work buys you. Functional programming languages are often powerful but can get confusing faster.

I do wish more people were exposed to functional programming though, it has some major benefits but we just don't have a lot of documentation on good practices.
OOP is making comments mandatory by making them part of the program; it doesn't make anything simpler that can't be made even simpler in a non-OOP language.

How does functional programming lead to confusing code? Unless you don't understand math, it is almost a 1-to-1 mapping. If a program written in a functional programming language is confusing, then rethink the logic.
 
Yay. New thread! Subscribed. I'm starting my very first actual programming job today, so this should be fun. I hope.

As for books, am I the only one that still likes a lot of O'Reilly's library? I had to re-read Programming Perl to get re-acquainted with the language before this job interview.



I forgot how well written it was. It's just a joy to read as long as you're not averse to the occasional prideful chest-thumping of a Unix aficionado. It gets you through the most essential fundamentals of the language to get you up and running in one fantastic tutorial chapter, and then the rest is just a great reference for most every nook and cranny in the language.
 
I do wish more people were exposed to functional programming though, it has some major benefits but we just don't have a lot of documentation on good practices.
How does functional programming lead to confusing code? Unless you don't understand math, it is almost a 1-to-1 mapping. If a program written in a functional programming language is confusing, then rethink the logic.
Functional or merely procedural / structured with functions?
 
Do you guys ever feel that you'll never be good enough?

I'm doing a BSc degree in CS, and I've gotten high marks in my programming classes (Python, Java, and HTML/CSS/JavaScript/jQuery). But it feels like I'll never know enough to get a decent job.
 
Do you guys ever feel that you'll never be good enough?

I'm doing a BSc degree in CS, and I've gotten high marks in my programming classes (Python, Java, and HTML/CSS/JavaScript/jQuery). But it feels like I'll never know enough to get a decent job.
Best thing to do is constantly do projects for fun. Keep the source and notes about problems and how you solved them.
 
At work we use ColdFusion. Arrays start at 1. I feel so dirty.

I really need to start doing some non web stuff on the side so I don't forget everything I learned in school.
 
Do you guys ever feel that you'll never be good enough?

I'm doing a BSc degree in CS, and I've gotten high marks in my programming classes (Python, Java, and HTML/CSS/JavaScript/jQuery). But it feels like I'll never know enough to get a decent job.
Continued education is part of programming. As long as you are proficient with the basics and know a framework/toolkit or two, learning additional (or specialised) topics usually isn't a problem and goes pretty fast. There's no way a company can expect detailed knowledge of their internal software if they look for a (junior) programmer.
 
OOP is making comments mandatory by making them part of the program; it doesn't make anything simpler that can't be made even simpler in a non-OOP language.

How does functional programming lead to confusing code? Unless you don't understand math, it is almost a 1-to-1 mapping. If a program written in a functional programming language is confusing, then rethink the logic.
I think you misunderstand; I didn't say it makes it simpler, I'm arguing against that in a way. I'm explaining how it makes things understandable by logically separating them. The power of functional programming is things like curried functions, which do get harder to read because now you are using a function as an argument to a function. You can write less code with it, but is less code the real desire, or making it easy to understand and debug?

Bad use of either is problematic, and you should consider the job before you pick your tools rather than thinking your mega tool can do it all. It's not the silver bullet you think it is.
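To make the "function as an argument to a function" point concrete, here's a small Python sketch; functools.partial is the standard library's take on partial application, and apply_twice/add are made-up names for illustration:

```python
from functools import partial

def apply_twice(f, x):
    # Takes a function as an argument -- the higher-order style under debate.
    return f(f(x))

def add(a, b):
    return a + b

add_three = partial(add, 3)           # partial application: fix the first argument

result = apply_twice(add_three, 10)   # 10 -> 13 -> 16
```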
 
I think you misunderstand; I didn't say it makes it simpler, I'm arguing against that in a way. I'm explaining how it makes things understandable by logically separating them. The power of functional programming is things like curried functions, which do get harder to read because now you are using a function as an argument to a function. You can write less code with it, but is less code the real desire, or making it easy to understand and debug?

Bad use of either is problematic, and you should consider the job before you pick your tools rather than thinking your mega tool can do it all. It's not the silver bullet you think it is.
I'm on the bottom of the dev totem pole, so it's usually my job to maintain or clean up what the senior devs used to make their stuff. In most cases the original dev is long gone. Personally, those that took the time to create legible and easy-to-follow code have my thanks (this is becoming increasingly rare). With those that used the latest automated tools or fanciful tricks, it usually means playing detective to recreate the scene of that time period (typically this comes from contractors).
 
Do you guys ever feel that you'll never be good enough?

I'm doing a BSc degree in CS, and I've gotten high marks in my programming classes (Python, Java, and HTML/CSS/JavaScript/jQuery). But it feels like I'll never know enough to get a decent job.
The first bit of code I ever "wrote" (read: copied verbatim from a book) was some BASIC stuff on my dad's Commodore 128 when I was like 8. I barely had any idea what I was doing. I just liked computers and video games and my dad made me understand that this is how you tell a computer what to do.

By the time I was 12, I was trying to wrap my head around pointers in C, and by the time I started high school, I was trying to understand what this OOP in C++ was all about. High school and first-year university was basically me retreading old ground as I coasted through programming classes reaffirming the things I had already taught myself. The last thing I remember learning from scratch in a class environment was 8086 ASM. I was fortunate to have a good professor, so that was fun, and as others have noted, it provided a good perspective for understanding what is happening with your code when you want to optimize things.

Then after first year, life happened (I blame women), and everything got sidetracked for other priorities. I'm 29 now, I've been in and out of university for the past 12 years, and I barely have anything to show for it aside from a general AA degree from the local college. In the interim, I've never really stopped programming though. I've picked up PHP (more than once), JScript, Java, C#, Perl, and bits and pieces of various other languages as the mood and the personal pet project has struck me. Hell, I'm staring down the barrel of having to pick up Ruby on Rails very quickly if I want to keep this new job.

I guess what I'm trying to get at, is that you're in a field where you probably will never know it all. That's absolutely normal. The idea is to learn and internalize as much as you can of the fundamental concepts and practices that underlie everything. The more you do that, the easier it's going to be for you to pick up any language, any architecture and get the job done.

Fundamentally, the reason why programmers passionately argue for their best & worst languages EVAR is because there generally is a way to solve 95%+ of all coding problems in any language. Sure, some languages are better at certain things, or clearer, or more concise than others, but at the core, it all boils down to very similar principles. So the language arguments often boil down to very personal preferences.

So, don't worry about it. As someone else said in here, just keep coding, and keep learning. That's what's kept me doing this my entire life, even when I thought the career was closed to me. There's always something new to learn, and some new way to think about an old problem. You should see that as part of the fun.
 
The first bit of code I ever "wrote" (read: copied verbatim from a book) was some BASIC stuff on my dad's Commodore 128 when I was like 8. I barely had any idea what I was doing. I just liked computers and video games and my dad made me understand that this is how you tell a computer what to do.

By the time I was 12, I was trying to wrap my head around pointers in C, and by the time I started high school, I was trying to understand what this OOP in C++ was all about. High school and first-year university was basically me retreading old ground as I coasted through programming classes reaffirming the things I had already taught myself. The last thing I remember learning from scratch in a class environment was 8086 ASM. I was fortunate to have a good professor, so that was fun, and as others have noted, it provided a good perspective for understanding what is happening with your code when you want to optimize things.

Then after first year, life happened (I blame women), and everything got sidetracked for other priorities. I'm 29 now, I've been in and out of university for the past 12 years, and I barely have anything to show for it aside from a general AA degree from the local college. In the interim, I've never really stopped programming though. I've picked up PHP (more than once), JScript, Java, C#, Perl, and bits and pieces of various other languages as the mood and the personal pet project has struck me. Hell, I'm staring down the barrel of having to pick up Ruby on Rails very quickly if I want to keep this new job.

I guess what I'm trying to get at, is that you're in a field where you probably will never know it all. That's absolutely normal. The idea is to learn and internalize as much as you can of the fundamental concepts and practices that underlie everything. The more you do that, the easier it's going to be for you to pick up any language, any architecture and get the job done.

Fundamentally, the reason why programmers passionately argue for their best & worst languages EVAR is because there generally is a way to solve 95%+ of all coding problems in any language. Sure, some languages are better at certain things, or clearer, or more concise than others, but at the core, it all boils down to very similar principles. So the language arguments often boil down to very personal preferences.

So, don't worry about it. As someone else said in here, just keep coding, and keep learning. That's what's kept me doing this my entire life, even when I thought the career was closed to me. There's always something new to learn, and some new way to think about an old problem. You should see that as part of the fun.
I loved reading this. I love the Computer Science field and I can't wait to see where my eventual degree will take me. You are always learning, and I guess the best thing isn't to try to find the best language to learn on, rather just find a way to start learning.
 
So...

I'm teaching a structured programming course in the fall at the college I work at. It was kind-of thrust upon me recently and I don't have much time to prepare.

Previously the course was taught in COBOL, but I'm going to avoid that with a 10-foot pole if I can.

It seems like in the past the course has focused on file I/O and programs that used well-defined methods to avoid a spaghetti-code nightmare. They mainly generated reports of increasing complexity as the course progressed.

I'm a bit concerned for what I'll be coming up with for my students. It's going to be an interesting summer.
 
So...

I'm teaching a structured programming course in the fall at the college I work at. It was kind-of thrust upon me recently and I don't have much time to prepare.

Previously the course was taught in COBOL, but I'm going to avoid that with a 10-foot pole if I can.
I remember that thread. Did you pick a language?
 
One thing I really should learn at some point is web programming. Although I can code in Java and all that, it's not that practical outside of enterprise usage. Web apps seem like they'd be a practical extension to my skillset.

What should I be looking at? Javascript, php...anything else? I'd check out Ruby On Rails, but my web host doesn't support it. :(
 
I think this year I might try to take up some hardware projects...get myself an FPGA board and work on my VHDL skills. You can come up with some really neat and useful designs if you know how to work an FPGA board. Completely different mindset though, since FPGA programs run in full parallel. You really need to be comfortable with state machines to get anywhere. There is such a large sense of accomplishment from creating working FPGA based designs though. I made my own simulated MIPS processor in school...still one of the biggest highlights of my degree.
Oh so you are familiar with FPGAs already? Haha my idiot manager hired some poor bastard CS major to do FPGA development right out of college and he was lost out of his mind. Ended up letting him go. My manager is an idiot. After doing FPGA development myself for the last 5 years, I finally got on some software projects. My god, the debugger makes this shit so fucking easy. You software guys probably don't appreciate this as much as I do lol.
 
One thing I really should learn at some point is web programming. Although I can code in Java and all that, it's not that practical outside of enterprise usage. Web apps seem like they'd be a practical extension to my skillset.

What should I be looking at? Javascript, php...anything else? I'd check out Ruby On Rails, but my web host doesn't support it. :(
I was big into web programming in the late 90s, and then did nothing but dabble until recently. At some point, I decided to pick up PHP/MySQL again, and I wanted a quick way to get back up to speed.

charlequin was nice enough to recommend Robin Nixon's Learning PHP, MySQL and JavaScript:



It doesn't go terribly in depth in any of the three, but it's a quick read (as far as programming books go anyway), and it's a fantastic primer to some of the most commonly used web technologies out there. It'll give you enough to get pretty much any project started, and you can pick up other books from there, or abuse the hell out of the abundant online resources for all three.

It was a perfect recommendation for me, and it sounds like it'd be a good one for you. Good luck!

Edit: You also won't need a web host for any of it while you're in development. Look up LAMP/WAMP/MAMP packages on Google, and you'll find plenty of resources to set up local test environments quickly.
 
Oh so you are familiar with FPGAs already? Haha my idiot manager hired some poor bastard CS major to do FPGA development right out of college and he was lost out of his mind.

During the summer after my first year of a math degree, I did VHDL debugging/testing. This was before I learned any coding, and when I learned C, I said "Hey, this looks familiar!".

/coolstorybro :(

Do you guys ever feel that you'll never be good enough?
I almost always think I'm good enough, but with more experience and knowledge, I sometimes say 'WTF' in hindsight:

I'm in the process of amputating the first PHP code I ever wrote. Has worked reliably for eight years, but is just a mess. Replacing that part of the project/internal-product and some third-party software with different third-party software. Some VPs thought I would be upset to be migrating away from stuff I wrote; I'm happy to never have to refactor it, it's so bad (aside from working well ;) ).

Other parts of it (that are being kept) are much better written and I'm much happier to maintain them.
 
I just started learning Python a couple of weeks ago, and I like to take learning exercises and think up ways to make the example programs more robust as a way to soak in knowledge while progressing. For the most part I've been successful so far, but I wanted to change a user input prompt to automatically process a key input instead of pressing the key and then pressing the Enter key when using the input function. I tried using getche() (imported from a C++ runtime module), but I can't figure out how to make it wait until the user actually inputs something. If I put getche() in the middle of a while loop, the print statement asking for the input just runs endlessly. Is there any way to get getche() to act like input by waiting for a key to be pressed? Thanks.
 
I just started learning Python a couple of weeks ago, and I like to take learning exercises and think up ways to make the example programs more robust as a way to soak in knowledge while progressing. For the most part I've been successful so far, but I wanted to change a user input prompt to automatically process a key input instead of pressing the key and then pressing the Enter key when using the input function. I tried using getche() (imported from a C++ runtime module), but I can't figure out how to make it wait until the user actually inputs something. If I put getche() in the middle of a while loop, the print statement asking for the input just runs endlessly. Is there any way to get getche() to act like input by waiting for a key to be pressed? Thanks.
Sounds like you're looking for a way to process terminal input in raw mode, which is a good job for the "tty" and "termios" modules. Here's a quick-and-dirty example:
Code:
import termios
import tty
import sys
import select
import os

# Save the current terminal settings so they can be restored on exit.
original_settings = termios.tcgetattr(0)
stdin_fileno = sys.stdin.fileno()
try:
    # cbreak mode delivers keypresses immediately, without waiting for Enter.
    tty.setcbreak(stdin_fileno)
    while (True):
        print "hit any key to exit"
        # Wait up to 1 second for stdin to become readable.
        (r_list, w_list, x_list) = select.select([stdin_fileno], [], [], 1)
        if (len(r_list)):
            char = os.read(stdin_fileno, 1)
            print "got char %s" % (repr(char))
            break
finally:
    # Restore the original terminal settings no matter how we exit.
    termios.tcsetattr(0, termios.TCSADRAIN, original_settings)
Change select's wait period from 1 to None in order to have it wait around forever for input on stdin and you should have a good starting point. Just remember to be extra careful to have termios put settings back to normal on exit.
 
Wow I just learnt about bit fields in C from looking at the Git source code, they would have helped so much in a recent coursework I had to do. In it I wrote a function to extract specific bits and sequences of bits from a 32-bit integer (a machine code instruction) and that one function was everywhere. I could have instead written a special 32 bit struct for each instruction (including sub-structs for bits that are the same in each instruction like opcodes and flags) and then simply cast to that struct or use a union with the struct (both of which are apparently bad, heh) to extract the individual sequences from an integer.
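As an aside, you can experiment with the same technique from Python via ctypes, which supports C-style bit fields. Here's a sketch of a hypothetical MIPS R-type instruction layout (fields are declared low-bits-first, which assumes a little-endian host; as in C, bit-field layout is ultimately implementation-defined):

```python
import ctypes

class RType(ctypes.LittleEndianStructure):
    # Declared least-significant-first: funct occupies bits 0-5,
    # opcode occupies bits 26-31.
    _fields_ = [("funct",  ctypes.c_uint32, 6),
                ("shamt",  ctypes.c_uint32, 5),
                ("rd",     ctypes.c_uint32, 5),
                ("rt",     ctypes.c_uint32, 5),
                ("rs",     ctypes.c_uint32, 5),
                ("opcode", ctypes.c_uint32, 6)]

class Instruction(ctypes.Union):
    # The union lets you write the raw 32-bit word and read the named fields.
    _fields_ = [("f", RType), ("raw", ctypes.c_uint32)]

# add $t2, $t0, $t1 encodes as 0x01095020
instr = Instruction(raw=0x01095020)
```

Reading instr.f.rs, instr.f.rt and so on then picks the register numbers straight out of the word, with no explicit shifting or masking.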
 
Wow I just learnt about bit fields in C from looking at the Git source code, they would have helped so much in a recent coursework I had to do. In it I wrote a function to extract specific bits and sequences of bits from a 32-bit integer (a machine code instruction) and that one function was everywhere. I could have instead written a special 32 bit struct for each instruction (including sub-structs for bits that are the same in each instruction like opcodes and flags) and then simply cast to that struct or use a union with the struct (both of which are apparently bad, heh) to extract the individual sequences from an integer.
Yeah I never knew about bit fields in C until I got my current job and was looking through some code to create ethernet messages where certain bits had to be set or not as flags. I had no idea they existed but they are so useful. I wish they had this shit in Java. I had to do something similar in Java and it is like 100x the code with all the shifting and the masking and crap like that.
 
Yeah I never knew about bit fields in C until I got my current job and was looking through some code to create ethernet messages where certain bits had to be set or not as flags. I had no idea they existed but they are so useful. I wish they had this shit in Java. I had to do something similar in Java and it is like 100x the code with all the shifting and the masking and crap like that.
One has to be careful with endianness, however.
K&R's "The C Programming Language" said:
Fields are assigned from left to right on some machines and right to left on others. This means that although fields are useful for maintaining internally-defined data structures, the question of which end comes first has to be carefully considered when picking apart externally-defined data; programs that depend on such things are not portable.
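The portable alternative that passage is hinting at is explicit shifts and masks. Python's struct module makes the byte-order side of this easy to see (0x01095020 here is just an arbitrary 32-bit example value):

```python
import struct

value = 0x01095020

little = struct.pack("<I", value)   # least significant byte first
big = struct.pack(">I", value)      # most significant byte first

# Explicit shifts and masks behave the same on any host, unlike bit fields.
opcode = (value >> 26) & 0x3F       # top 6 bits
funct = value & 0x3F                # bottom 6 bits
```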
 
Sounds like you're looking for a way to process terminal input in raw mode, which is a good job for the "tty" and "termios" modules. Here's a quick-and-dirty example:
Code:
import termios
import tty
import sys
import select
import os

# Save the current terminal settings so they can be restored on exit.
original_settings = termios.tcgetattr(0)
stdin_fileno = sys.stdin.fileno()
try:
    # cbreak mode delivers keypresses immediately, without waiting for Enter.
    tty.setcbreak(stdin_fileno)
    while (True):
        print "hit any key to exit"
        # Wait up to 1 second for stdin to become readable.
        (r_list, w_list, x_list) = select.select([stdin_fileno], [], [], 1)
        if (len(r_list)):
            char = os.read(stdin_fileno, 1)
            print "got char %s" % (repr(char))
            break
finally:
    # Restore the original terminal settings no matter how we exit.
    termios.tcsetattr(0, termios.TCSADRAIN, original_settings)
Change select's wait period from 1 to None in order to have it wait around forever for input on stdin and you should have a good starting point. Just remember to be extra careful to have termios put settings back to normal on exit.
Thanks for the example; I'll try it out when I get home. I'm not exactly sure what's going on here, with all of the function calls I've never seen before. I notice by the print statement that this is Python 2.x... will this work in 3.x?

Also, does "finally" automatically execute after "try" resolves? I've used try/except before, but not finally.
 
Yeah I never knew about bit fields in C until I got my current job and was looking through some code to create ethernet messages where certain bits had to be set or not as flags. I had no idea they existed but they are so useful. I wish they had this shit in Java. I had to do something similar in Java and it is like 100x the code with all the shifting and the masking and crap like that.
Interesting. When I took programming courses we used C++ and Java as standard, and to learn architecture we used C for reasons like this. At the high level, C++ does everything C will do just fine, but going downwards, C has almost a one-to-one correlation with lower-level code syntax -- very useful for understanding structures, since you can see how the assembler moves memory around, then easily look at the C syntax and make the connection.
 
Thanks for the example; I'll try it out when I get home. I'm not exactly sure what's going on here, with all of the function calls I've never seen before. I notice by the print statement that this is Python 2.x... will this work in 3.x?
Theoretically it should, though you'd need to change the print statements into functions if I recall correctly. Check the termios, tty and select module documentation for more details.
Also, does "finally" automatically execute after "try" resolves? I've used try/except before, but not finally.
The "finally" code always gets called when exiting a "try/finally" block, no matter what. Exiting normally, calling "return", raising an exception, or anything short of catastrophic interpreter failure will always trigger that "finally" block. I believe "with" blocks cover many of those cases also, but "finally" is a bit more generic.
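To make that concrete, here's a tiny sketch (the function and list names are just for illustration):

```python
results = []

def demo(raise_error):
    try:
        if raise_error:
            raise ValueError("boom")
        return "returned"
    finally:
        # runs on the normal return AND on the exception
        results.append("finally ran")

demo(False)
try:
    demo(True)
except ValueError:
    pass

print(results)  # ['finally ran', 'finally ran']
```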
 
Theoretically it should, though you'd need to change the print statements into functions if I recall correctly. Check the termios, tty and select module documentation for more details.

The "finally" code always gets called when exiting a "try/finally" block, no matter what. Exiting normally, calling "return", raising an exception, or anything short of catastrophic interpreter failure will always trigger that "finally" block. I believe "with" blocks cover many of those cases also, but "finally" is a bit more generic.
It looks like termios and tty are Unix only, anyway.
 
As for books, am I the only one that still likes a lot of O'Reilly's library?
Still my go-to resource for programming. Just got Clojure Programming and it has been pretty good. However, O'Reilly's books haven't been as universally excellent as they used to be. While their quality has been on an upswing recently (it was bad during the mid-2000s), there are still books that I regret getting. Still, I really like the typesetting/fonts in their animal books.
 
Programming OT! Yay!


I love C++, and I do prefer it to C.

For learning though, I'd probably stick with C. I could be wrong about this, but I feel like C is just in a good place for learning with how it accomplishes different things. Some of the annoyances with C can force you to learn certain concepts. For instance, I think Java would be a very easy, but terrible first language to learn.


I was big into web programming in the late 90s, and then did nothing but dabble until recently. At some point, I decided to pick up PHP/MySQL again, and I wanted a quick way to get back up to speed.

charlequin was nice enough to recommend Robin Nixon's Learning PHP, MySQL and JavaScript:

I'm going to check this out, as I've been wanting to get into web programming as well but I've been putting it off for a while. Thanks.


Still my go to resource for programming. Just got Clojure Programming and it has been pretty good.
Are you familiar with Lisp otherwise? If not, how has the learning curve been? So many people swear by Lisp once they get over the initial hump, has me interested.
 
I first came into contact with functional programming when I took a Haskell course last semester, and I loved it from the first moment. If anyone wants to have a look at it, there's this nice, free and comprehensive tutorial: Learn You a Haskell for Great Good!

After reading the Ruby chapter in the great book Seven Languages in Seven Weeks, I felt like I rediscovered the magic of programming. It's just so beautiful.


Since when did writing fewer lines of code determine how good a language is? I thought it was all about how much control you have over the inner workings of memory and logic. Obviously C or assembly would meet this, but Python does a lot of the work for you and Haskell seems to do the same. Is there anything else that makes Haskell so great?
It's not just about the lines of code, it's about expressiveness. I always liked Java, but after reading the aforementioned book, I almost hate it. So much code to support the task you want to achieve instead of just doing the task.

Simple example: You've got an array with numbers (or any kind of objects) and want to filter it to contain only the ones greater than 10 (or that fulfill any kind of condition).

Java:
Code:
Vector<Integer> v = ... ;
Vector<Integer> filtered = new Vector<Integer>();
for(Integer i : v) {
  if(i > 10) {
    filtered.add(i);
  }
}
v = filtered;
Ruby:
Code:
v = ...
v.select! {|n| n > 10}
That's all. That's all I wanted to do. I didn't want to create temporary variables or loops, I just wanted to select all the numbers greater than 10.
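One nuance worth knowing about the Ruby example: select! mutates the array in place (and returns nil when nothing was removed), while plain select returns a new array and leaves the original alone:

```ruby
v = [5, 11, 17, 1]

kept = v.select { |n| n > 10 }   # non-mutating: returns a new array
puts kept.inspect                # [11, 17]
puts v.inspect                   # [5, 11, 17, 1] -- unchanged

v.select! { |n| n > 10 }         # mutating: filters v in place
puts v.inspect                   # [11, 17]
```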

I really do not want to use a language without closures/code blocks/lambdas anymore; they make programming so much more comfortable. I bought a Scala book recently and I'm aiming to abandon Java completely for Scala, as it also runs on the JVM and can use all existing Java classes.
 
Ruby is an awesome language, and it is a joy to use. Magical really is a good word to describe it.

People smarter than I am have said that it comes at the cost of performance, though.


e: I also love Scala! I like the way it does parameters, although it does take getting used to.
 
Oh so you are familiar with FPGAs already? Haha my idiot manager hired some poor bastard CS major to do FPGA development right out of college and he was lost out of his mind. Ended up letting him go. My manager is an idiot. After doing FPGA development myself for the last 5 years, I finally got on some software projects. My god, the debugger makes this shit so fucking easy. You software guys probably don't appreciate this as much as I do lol.
Yeah. I have a Computer Engineering degree so I worked with VHDL more than anything. I just happened to land a job in Java development.
 
Yeah. I have a Computer Engineering degree so I worked with VHDL more than anything. I just happened to land a job in Java development.
That is cool. I'm an EE who took some VHDL classes in grad school and landed a job doing that. But I have found that I really love programming and should have just gone into Comp Eng or CS from the beginning (I really love the EE math, but there doesn't seem to be much use for it out in the real world). I have been turned down for a few jobs because my software background is sketchy, so I am currently working on a state machine editor in Java that I hope to open source and put on my resume.
 
So, I have a pretty significant collection of textbooks from coursework I've had over the years, but one of my favorite book(s) going into CS way back when were these:

Deitel and Deitel

They're probably a bit expensive, but used in conjunction with coursework, or with a very strict regime of personal study, you'd have a hard time finding anything else that goes from A to Z while working in data structures, algorithms, etc., as you go. Some of the latest editions even cover some game programming with an open-source 3D engine, I believe.

The books I look at the most these days are very particular to certain topics, but if you go the C or C++ route (which people often do), two must-haves would be:

C Programming

Effective C++

The first is a straight-up C book that is probably the best introduction to the language one can have. It's no frills, and tightly constructed. You'd have a hard time finishing it and not knowing the fundamentals of C-anything (including the topics people become unreasonably afraid of, such as pointers).

The second is an in-depth exposé of C++ pitfalls, designed for the experienced developer. It's not an entry-level read, and even if you've been using C++ for a while, you'd be shocked at how much of it you didn't know.

Maybe these recommendations will be useful for the programmer who's decided that C or C++ will be their starting point. I won't recommend them as such, but I'm about as language agnostic as they come -- any road will take you where you want to be as long as you are willing to eat, breathe, bleed and snort code.
 
Are you familiar with Lisp otherwise? If not, how has the learning curve been? So many people swear by Lisp once they get over the initial hump, has me interested.
I don't actually program in Lisp. I mainly read books about Lisp-like languages because I've written a scripting language based on Lisp (easier than it sounds). I'm mainly looking at how Lisp is structured and to get ideas for my own language (for instance, how Clojure deals with built in syntax for sets and vectors). So, in a way, I knew how Lisp worked before I learned Lisp, making the curve easier. I will say that I have learned a LOT about programming from Lisp, but as I mainly write games and GUI tools, I'm not sure I would use it over something like Objective-C or Java (I've never seen anyone write a functional game simulation).

That being said, Clojure looks pretty great. I'm considering using it for writing my GUI tools. It can use the Java Swing GUI stuff, and tools are more obviously functional.

If you want to learn Lisp, check out the book "Land of Lisp". It's a decent introduction to the language, but more importantly, it's a great introduction to the things Lisp does that make it amazing. After reading through that book, you may not be an expert at writing Lisp, but you'll know why people swear by it.
 
Simple example: You've got an array with numbers (or any kind of objects) and want to filter it to contain only the ones greater than 10 (or that fulfill any kind of condition).

Java:

Ruby:

That's all. That's all I wanted to do. I didn't want to create temporary variables or loops, I just wanted to select all the numbers greater than 10.

I really do not want to use a language without closures/code blocks/lambdas anymore; they make programming so much more comfortable. I bought a Scala book recently and I'm aiming to abandon Java completely for Scala, as it also runs on the JVM and can use all existing Java classes.
Here's how you do it in one line:

Code:
CollectionUtils.filter(myListWithNumbers, new HigherThanPredicate(10));
Turns [5,11,17,1] into [11,17].
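For anyone curious: CollectionUtils.filter and Predicate come from Apache Commons Collections, and HigherThanPredicate is a custom class the post doesn't show. Here's a self-contained sketch of how that pattern works, using stand-ins for the Commons interfaces (all the names are illustrative):

```java
import java.util.*;

public class FilterDemo {
    // minimal stand-in for Apache Commons' Predicate interface
    interface Predicate<T> { boolean evaluate(T t); }

    // stand-in for CollectionUtils.filter: removes elements that
    // fail the predicate, mutating the collection in place
    static <T> void filter(Collection<T> c, Predicate<T> p) {
        Iterator<T> it = c.iterator();
        while (it.hasNext()) {
            if (!p.evaluate(it.next())) it.remove();
        }
    }

    // the HigherThanPredicate from the post, as a small class
    static class HigherThanPredicate implements Predicate<Integer> {
        private final int threshold;
        HigherThanPredicate(int threshold) { this.threshold = threshold; }
        public boolean evaluate(Integer i) { return i > threshold; }
    }

    public static void main(String[] args) {
        List<Integer> numbers = new ArrayList<Integer>(Arrays.asList(5, 11, 17, 1));
        filter(numbers, new HigherThanPredicate(10));
        System.out.println(numbers); // [11, 17]
    }
}
```

Like the Commons version, this filters the collection in place rather than returning a new one, so you still "pay" with the predicate class definition somewhere; the call site is just shorter.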