Beep Boop Bip

File 150448609042.jpg - (110.47KB , 1280x720 , mpv-shot0028.jpg )
1547 No. 1547 [Edit]
It doesn't matter if you're a beginner or Dennis Ritchie, come here to talk about what you are doing, your favorite language and all that stuff.
I've been learning Python because C++ was too hard for me (I'm sorry Nenecchi, I failed you). I reached OOP and it feels weird compared to C++; anyway, I never got it completely.
>> No. 1549 [Edit]
File 150465284126.gif - (773.18KB , 320x240 , Dai_Mahou_Touge.gif )
I have this irrational perception that I don't want to learn any programming language because I fear it will become obsolete or superseded in the next 5 years. Is there any that almost invariably won't, maybe one in the C family or maybe even Rust?
>> No. 1551 [Edit]
C and C++ will practically never be obsolete.
>> No. 1552 [Edit]
Try a few and go with the one you like. This "if it becomes obsolete then I wasted my time and will have to start from zero" attitude is unhealthy thinking.
>> No. 1612 [Edit]
First of all, a mainstream language won't be obsolete in just 5 years. At worst, it'll take a decade or two, maybe three, before one of these becomes obsolete. And even among niche ones, some will just never die.

Secondly, the time you spend learning a language is never lost. Most languages come with paradigms, idioms, etc. If you can program in one language, then you can pick up a similar one in less than a week (unless you want to be an expert, of course).

Finally, learning a language you might never use isn't a waste of time. Some languages are just really interesting to learn for the sake of it. They change the way you think about programming, teach you to solve problems in a (better) different way, etc.

Most programmers (and every good programmer) know more than one language anyway.
>> No. 1614 [Edit]
So, which are the mainstream languages that will take 10-30 years to become obsolete?
>> No. 1615 [Edit]
I don't understand why you'd want a language that will die in 30 years when you can pick C and C++ and be set for multiple lifetimes
>> No. 1628 [Edit]
Could you please elaborate on what you meant? I'm technologically handicapped.
>> No. 1630 [Edit]
He means that C and C++ will basically never become obsolete and you should learn those two if you're worried about learning a language that will become obsolete.
>> No. 1631 [Edit]
>C and C++ will basically never become obsolete
I understood that, but not why. Why won't they become obsolete? Why won't there ever (within realistic expectations) be a language that makes them obsolete?
>> No. 1632 [Edit]
Because they are so pervasive. They have been used for years and years as the base of software. That, and from what limited knowledge I have, they are more basic languages than things like Java or C# or whatever else, which are like additions built with C/C++ as the foundation.
>> No. 1638 [Edit]
File 151736164996.png - (8.33KB , 762x600 , Pic-3.png )
I really want to get into an application development language, whether it's Java, C, or C++.

I've almost solely worked with web development languages: HTML, CSS, and JS (plus jQuery).

I have a tough time deciding on my own, however, and I'm out of touch with the best place to start. I'm not even sure what I'd program, beyond very simple games or desktop applications to automate the bizarre needs I occasionally have.
It's not fair to say I'm completely new to Java or C++. I did a semester of each in high school, and I understand data types and program flow, since I've also tried many web-related languages in addition to JavaScript.

I find it funny how large a concern this is to new programmers, given how similar high-level languages are to each other. It's a question born of ignorance. Taken at face value, if you really wanted a language that will never go obsolete, try assembly, or even better, binary.
C and C++ have at least another 30 years left in them. No other languages provide a closer-to-the-metal approach and unparalleled performance, in addition to a mountain of software libraries accumulated over the years.
Rust still seems like fanboy vaporware to me at this point. There's a lot of hype, but I've yet to download any software using it, and even the most competent Rust guy I know says it's not ready for serious deployment. It's the new Haskell / LISP.

That being said, it's surprising how fast web-based languages and libraries die a quick and morbid death, only to be replaced by something nearly identical, almost immediately after.
>> No. 1640 [Edit]
File 151756130680.gif - (898.01KB , 400x214 , Open.gif )
>I find it funny how large of a concern this is to new programmers
Oh, I'm not a programmer at all. I used to dabble in web design when CSS was just emerging, and I recall hating having to stick to formatting. After a while I stopped practicing, since I disliked the move from HTML to CSS, and whatever knowledge I had of anything faded away. I'm just someone who would like to find a solution to the eternal problem of having to leave the house to work, and it seems like learning programming can alleviate that.
>It's a question born of ignorance.
>try Assembly, or even better, binary.
>C, and C++ have atleast another 30 years left in them.
What I also meant by the frustration of language-learning and the possibility of obsolescence is personal intelligence boundaries. I'm not that smart, and math itself was never my forte. I'm skilled at organization and at finding things in a sea of other things, but that doesn't mean I can just throw myself into learning binary and achieve whatever goals I want with it.

Considering even non-amateur programmers go for "easier" options rather than C or C++ (what happened to C+?), I assume you need certain dispositions or certain levels of intelligence many don't have.

So I guess I could rephrase my original question as: if I were to start today, with what language could I end up working from home in X amount of time, with average intelligence? X being whatever it takes to be proficient enough to get jobs with it.
>> No. 1644 [Edit]
File 151791747138.gif - (241.21KB , 497x331 , giphy.gif )
Hikikomori here. I'm trying to find a way to make money from home without going outside. What programming language do you guys recommend for beginners?
>> No. 1648 [Edit]
File 151960936395.jpg - (111.59KB , 1280x720 , 150102648481.jpg )

start from the beginning, learn LISP with SICP.

you will be competing with freelancers from the 3rd world, who are incapable of thinking critically. You need to learn how to code, not how to write in a language. You'll have to learn the language, yes, but that is secondary to what is most important.

Being able to speak fluent English gives you a bit of an advantage, and make sure to get a GitHub account up and going (and contribute at a REGULAR AND CONSISTENT RATE).
>> No. 1657 [Edit]
File 152234679719.gif - (747.57KB , 490x276 , Ready.gif )
>You need to learn how to code, not how to write in a language.
Does this apply to the suggestion an anon made about learning assembly? Or is that a brainlet filter?
>and contributing at a REGULAR AND CONSISTENT RATE
Not really that acquainted with GitHub. Do you mean commits (updates) or actually engaging with other people and their projects there?
>> No. 1665 [Edit]
Abelson (or is that Sussman?) begins the MIT course on computer science by saying that 'computer science' is a terrible name for the subject.
Firstly, it's not a science; it's closer to engineering or art. Secondly, it's not really about computers, because the computer is but a tool for you.
Then he contrasts computer science to math by showing an equation for what a square root is, and notes that though the equation is correct it doesn't really tell you how to find one. He then shows a program that can find a square root of a number.
What programming really is about is giving precise instructions so that an abstract system can execute or compute them. The system is called a computer, but here that term defines a use for it, not the box you usually think of when encountering the word.
And understanding this fundamental idea of giving instructions is more important than any language or hardware that implements an environment where you're going to put the ideas into use.
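That lecture's square-root example translates directly into code; here's a minimal sketch of Newton's method in Python (a translation of the Scheme original; the tolerance value is an arbitrary choice):

```python
def sqrt(x, tolerance=1e-9):
    """Newton's method: start with a guess and repeatedly
    average it with x divided by the guess."""
    guess = 1.0
    while abs(guess * guess - x) > tolerance:
        guess = (guess + x / guess) / 2
    return guess

print(sqrt(2))  # close to 1.4142135...
```

Unlike the equation defining a square root, this is a precise sequence of instructions a computer can execute -- which is exactly the distinction the lecture draws.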

I can't expand on Anon's suggestions about github, but it sounds like good advice. Your github page can be treated like a resume, so gather proof of your experience there. Git is also almost necessary for working on actual projects.
One thing you could also do is bounty hunting: people place bounties on software features they want to see implemented. And then you can try to make a successful startup and sell out, or something.
>> No. 1666 [Edit]
Thanks for the reply, I loved the way you explained it. What would you call Computer Science if you were given the power and authority to change it forever?
>> No. 1672 [Edit]
Because money. COBOL is probably the most hated language in history, and it still exists, not only in the form of legacy code bases that nobody has the money, time or skill to replace, but also because some people just can't let it die (I'm looking at you, IBM).

Languages such as C and C++ have been very, very, very popular. And it'd be impossible to replace all the software, libraries, etc. written in them in just 3 decades. They are also very robust, which makes them even harder to replace since they still werk.

But that's not the only reason. Nearly all mainstream languages are from what we call the "C family": they are all heavily inspired by C and look very close to each other (if you don't have any experience with languages of other families you might not see what I mean, as you probably think that C and Python are total opposites that have little to nothing in common). Therefore learning one lets you pick up others of the C family fairly quickly. Just for the sake of teaching, they will never really disappear. C is a really simple language, the syntax can be learned very quickly, and you get to learn a lot of core programming concepts without having to worry about lots of boilerplate that means nothing to you (yet). C/C++ will stay relevant for at least another 50 years. Probably 100, because you'd also have to wait until all the stubborn programmers of these languages die out...
>> No. 1680 [Edit]
File 152503807397.jpg - (257.23KB , 850x1275 , Languies.jpg )
Pick one.
>> No. 1681 [Edit]
>> No. 1682 [Edit]
A toss-up between C++ or VB, the latter because she's cutest and looks lonely, and the former because it's a language I actually use.
>> No. 1698 [Edit]
Java is widely hated but still widely used. I have more experience with Java than any other language, though I'm trying to change that. It's verbose, has garbage collection, and is portable thanks to the JVM (with a couple of exceptions); admittedly somewhat bloated, but with lots of documentation and libraries. Often seen as the poster child of the object-oriented paradigm, but it has recently added more functional features, such as lambda expressions. JavaFX isn't great, but it's easy enough to learn for making GUIs. One interesting thing about Java's boilerplate and verbosity is that you can write a hundred lines of Java that do essentially the same thing as a very short shell or Python script. But it's still useful for a lot of things. I do think Java and OOP in general tend to overemphasize extensibility and modularity, though. There are some bad design patterns and features in Java, like access modifiers or getters and setters; not a big fan of that stuff. But it's a good way to learn about OOP and programming in general: polymorphism, inheritance, control structures, and lots of other stuff I don't feel like writing out. Decent language despite all the flak it gets.
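For a feel of that verbosity gap: the word-count task below is a couple of lines of Python, whereas an equivalent Java program would need a class, a `main` method, and a `HashMap` to juggle (the sample text is made up):

```python
from collections import Counter

# Count word frequencies and print the two most common words --
# a task that typically takes a screenful of boilerplate in Java.
text = "the quick brown fox jumps over the lazy dog the fox"
counts = Counter(text.split())
print(counts.most_common(2))  # [('the', 3), ('fox', 2)]
```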

C++ is fast but easy to mess up with security and memory management. Widely used for things that depend on performance, such as games, but it just isn't worth the headaches. Learning C++ made me appreciate Java more -- garbage collection, references instead of pointers, and shit like that. For C++, you can use Qt or GTK. I personally never got into GUI development for C++, though I did for Java.

Python is okay. It's used for machine learning, Django (web dev backend), learning programming, and so on. "Forced indentation of code" is a meme on /prog/, since some people find it annoying that organization is syntax in this language. I'm surprised Python 2.x still exists, and it shows how making changes can cause fragmentation in a community. More people are adopting Python 3.x though, which is good. Paths for different versions of Python can be annoying. I've worked on a project that involved a tool called Anaconda, though, which made it easy for everyone to make sure they had the right versions of Django and Python and whatnot, to avoid the "well, it works on my machine" issues many people have. Lots of modules and community support too. However, Guido recently stepped down as BDFL, so who knows what the future of Python will be.

Ruby is dying. People use Django or Node instead of Rails. It's slow, basically competing with Python in the realm of relatively simple interpreted languages. Never really heard of it being used outside of RPG Maker XP and Ruby on Rails. Wouldn't recommend getting into it. It's a sinking ship.

PHP is a pile of garbage. It's ancient, full of security holes and black magic fuckery, and it should be avoided at all costs. Maybe you'll end up using it when maintaining some legacy web codebase, but it sucks ass. I know of some cool tricks for hacking poorly-coded PHP sites, using fun things like remote file inclusion and web shells. Interesting from an attacker's standpoint, but annoying as fuck if you're a web developer who has to deal with this. Don't use PHP.

C# is like Java, but for Microsoft shills. I never got into it, but it's only worthwhile if you're gung-ho about Microsoft (which I'm not).

JavaScript, despite all the hate (some of it deserved, for its weird quirks!), is one of the most important programming languages to learn. These days, you can't really do any web development without JavaScript. Very few people use vanilla JS; instead you use shit like Angular, Node, Express, jQuery, and so on. ECMAScript 6 added class syntax on top of the old weird prototype-based inheritance of ES5. It's weird how JS is based on the ES standard but they're not exactly the same; I never really understood that. Anyway, JavaScript Object Notation (JSON) is cool too, and you can even use it with non-web stuff. It's a cool alternative to XML; I'd never use XML these days. NoSQL/document-based databases like MongoDB are cool, and a good start if you're learning JS. With the MEAN stack, you can have a JS frontend, a JS backend, and a JSON database. Makes things slightly easier, even though web dev is pretty complicated now.
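Since JSON isn't tied to the web, any language can read and write it; a minimal Python sketch (the field names are invented for illustration):

```python
import json

# Serialize a plain dict to JSON text and parse it back --
# the same text a JS frontend would produce with JSON.stringify().
post = {"id": 1665, "tags": ["programming", "json"], "sticky": False}
text = json.dumps(post)
restored = json.loads(text)
assert restored == post
print(text)
```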

Perl is extremely terse to the point of being unreadable. There are lots of cool one-liners you can do with it, and sometimes I even use a little bit of perl in my shell scripts, but overall, it's not really usable. I've seen some older sites using perl, but it's not aging well for modern concepts such as responsive design. Sort of reminds me of old-school cgi. I'm no perl connoisseur, but apparently perl6 was slow to be adopted. Many people stick with perl5, just like the split between Python 2 and Python 3. Wouldn't bother learning perl in 2018.

C is old-school procedural programming. Fast, but simple. You could think of it as C++ without the OOP, or the other way around: C++ is like C but with classes tacked on as an afterthought. Wanna learn about pointers and compiling and other more old-school stuff? I guess you could learn it with C (or C++). But in the real world, you're not likely to use it, except for a CS undergrad class or maybe legacy code. C++? Sure. Pure C? Not so much. Maybe if you do embedded systems shit where resources are tight, or you really need that extra performance squeezed out of something. But then again, it's good to have knowledge of non-OOP paradigms, like procedural, imperative, and perhaps even functional (though I'd never recommend functional languages for anything other than messing around -- very few jobs with them, too obsessed with concurrency, no return statements, weird concepts like currying and monads; the only real cool thing is lambda expressions, which I use in Java all the time). Anyway, got a little off-topic, but C is kinda old-school and not really something you'd want to base a modern project on. If you want something faster, kind of like C or C++, maybe look into Rust, which is similar but with better memory safety built in.
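Currying and lambdas, name-dropped above, are simple enough to show in Python (a toy sketch, not tied to any particular functional language):

```python
# A curried function takes its arguments one at a time,
# returning a new function at each step.
def add(a):
    return lambda b: a + b

add5 = add(5)     # partially applied: "add 5 to things"
print(add5(3))    # 8
print(add(1)(2))  # 3 -- both arguments supplied at once
```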

I never got into Visual Basic.

I've heard good things about R, but I've never actually used it.

Never used Scala. Don't know much about it.

Shells are not programming languages. Shells are shells. Technically, you can do shell scripting, which is useful. But would I call that fully-fledged programming? Not so sure about that. I do a lot of bash/zsh one-liners and I like customizing my .zshrc and making cron jobs and all that jazz (though systemd is subsuming all that shit nowadays). Lots of cool and powerful stuff with shells. Definitely worth learning command line stuff if you want to program, no matter which OS or programming language you use. But be warned: PowerShell is a joke compared to Unix shells like bash or zsh.

ActionScript? Flash is dead. Not worth your time.

Other languages that aren't on that list, but are worth mentioning:

Swift -- optionals/nils are interesting, also it's what you need to make iOS apps, since Objective-C is being phased out

Kotlin -- successor to Java as the go-to language for Android development

Haskell (for academic or hobbyist purposes)



Assembly, such as MIPS or ARM or x86: if you wanna learn more about how computers really work, it can be useful to learn some assembly. You'll learn about registers and all that good stuff. It's a total pain in the ass. You'll appreciate higher-level languages more after doing shit in assembly. Assembly is very simple -- and that's the problem. It's hard to get an idea of how combinations of jumps and pushes do anything. Higher-level languages introduce extra layers of abstraction, so you can think more about the problem you're trying to solve and less about CPU registers and whatnot. A lot of compilers compile down to assembly or machine code, which is pretty hard to read or reverse engineer (though some languages compile to bytecode instead). So if you want to get into reverse engineering or malware, assembly is more important to learn. But it doesn't make sense to try and write an application in assembly as opposed to something like Java or C++.

Markup and "that's technically not a programming language" languages: HTML5, CSS3, Markdown, YAML, LaTeX, preprocessors such as Sass, and so on. They might not be Turing-complete, but so what? They're still useful to know. Pedants argue about what to call them, but regardless, you should know some of them anyway.

When some people ask "which programming language should I learn?" the answer is: many programming languages. You might only speak one language in daily life, but you might need to use multiple programming languages as a programmer. They have different design philosophies, different built-in methods, different libraries, run on different devices, and have different use-cases. They're not all general purpose, and even so-called general purpose languages are better for some things and worse for others.

There is no "best" language to learn, so stop obsessing over which one is best and just pick something. I'd suggest Python, HTML/CSS/JS, or Java to start with. When you learn your first programming language, you're learning programming itself, paradigm-specific stuff, and language-specific stuff. Then, for your second programming language (assuming it's in the same paradigm, which should ideally be OOP-based, even if it's not 100% OOP), all you're really learning is the syntax and language-specific stuff. It's way easier to learn another programming language after you've already learned your first one.

It's easy to learn programming, but only if you have realistic time expectations. If you think you're gonna make a game in a day, you'll be distressed by how complicated everything sounds. Rome wasn't built in a day. So you have to pace yourself, like learning linked lists one day, then stacks and queues, then binary trees the next, time complexity the day after that, and so on. That's not a really good order, just an example. But that brings up another topic: there is a difference between learning the basics of a language, and learning more in-depth topics, such as algorithms, data structures, software engineering, project management, best practices, devops/agile, tools, debugging, design patterns, etc.
>> No. 1699 [Edit]
>if you really wanted a language that will never go obsolete try Assembly
Terrible advice, considering the limited use-cases of Assembly, and also how CPU architectures change over time. x86 might be hot now, but it's being overtaken by ARM. Different CPU architectures have different assembly code. I learned MIPS assembly in college, and it's pretty much useless. I don't even put it on my resume.
>> No. 1784 [Edit]
I'm aiming for a career in Mechanical Engineering so programming for me is going to be something I do to assist with prototyping. That is to say, it'll be primarily more of a hobbyist activity. On that note, I've decided to learn C first, then Forth, then Lisp and then Ruby.

C because I hear it's a simple yet powerful, "bare metal" programming language. I believe it also teaches you about how the computer works internally.
Forth because I hear it gets used in the aviation and space sectors of engineering. It's even closer to the "metal" than C, I hear. I've also heard it gives the programmer immense control and presents its own unique problems that challenge you to think differently.
Lisp because I hear it will induce some kind of revelation about programming when you come to understand it. I hear that it's perfect for when you don't know what you should be doing because Lisp code is flexible enough to be taken in a new direction with relative ease.
Ruby because I predict I'm going to want a more friendly language to code in for my own applications in the house when I'm not trying to be maximally efficient.
I also consider Ada and Fortran as possibilities.

My most recent completed project was a program that could print a list of all the palindromes from 10 to 1,000,000 without using string manipulation (since I have yet to learn it).
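A palindrome test like that can indeed be done with arithmetic alone, no string manipulation; one possible sketch:

```python
def is_palindrome(n):
    # Build the digit-reversed number arithmetically, then compare.
    original, reversed_n = n, 0
    while n > 0:
        reversed_n = reversed_n * 10 + n % 10
        n //= 10
    return original == reversed_n

palindromes = [n for n in range(10, 1_000_001) if is_palindrome(n)]
print(len(palindromes))  # 1989 of them in that range
```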
>> No. 1785 [Edit]
I would recommend LISP first, as it will give you a better understanding of the science in computer science and better illustrate the benefits of doing things in different ways (especially recursion).
C is great exactly for the reasons you list, but the manual memory manipulation crap and a lot of C's idiosyncrasies (including the compilers) will slow down the general concept stuff that you should be learning first and foremost.

I would also recommend Python over Ruby given what you are trying to accomplish, as Python is a lot better supported overall and has a much larger library of publicly available code to steal-- er, borrow from.
>> No. 1786 [Edit]
Thanks for the support but unfortunately, I've already bought books on C to get me started with it. I'm going to press on with C just for the sake of reassuring myself that I didn't waste my money. Considering what you said though, I guess I could make Lisp second and I'll learn it along with watching those SICP videos on Youtube before going into Forth.

I've heard that Python is popular for being popular so I'm somewhat aware of how well supported it is. When I look at Ruby code though and how it's all so sterilised of all of the "computer" things that you usually find in software code, I feel really drawn to it. It seems really comfortable. I'll just have to keep Python in mind as well.
Thank you.
>> No. 1787 [Edit]
File 154943397236.jpg - (85.93KB , 706x455 , slap.jpg )
Learning C++ atm. I need to learn data structures and algorithms, but I keep getting blocked by math and big O.
>> No. 1955 [Edit]
File 157517725294.jpg - (201.97KB , 1920x1080 , fecd8526c650eba6709a3a9fe9c7666e.jpg )
If you want a fun language to use, I recommend trying D. Its template system is awesome and a lot easier to use than the one in C++, as I recently discovered in a project that I'm working on. Wish it were more popular, however. Its rough edges could be smoothed out if it had more manpower.
>> No. 1958 [Edit]
File 157596783467.jpg - (431.80KB , 1080x2160 , Screenshot_20191210_083838_com_termux.jpg )
I can definitely recommend learning assembly. It helped me tremendously to get an intuitive understanding of how computers work. Many of the things in C++ and other languages that I had previously found hard to grasp (e.g. binary logic, bit shifts, flags, two's complement notation etc.) are like second nature to me now because of dealing with them all the time when writing assembly code.
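The bit-level concepts mentioned (shifts, two's complement) can be poked at interactively even from a high-level language; a small Python sketch:

```python
# Shifting left multiplies by powers of two; shifting right divides.
assert 5 << 2 == 20
assert 20 >> 2 == 5

def twos_complement(value, bits=8):
    # Invert the bits and add one, masked to the given width --
    # this is how CPUs represent negative integers.
    return (~value + 1) & ((1 << bits) - 1)

print(twos_complement(5))       # 251 -- the byte pattern for -5
print(bin(twos_complement(5)))  # 0b11111011
```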

As your first programming language, I recommend either learning something very high-level (Haskell, Python) where you can focus on learning algorithmic thinking without having to deal much with things like memory management, integer overflows and so on, and also start writing somewhat useful software early on; or start from the ground up with assembly so that you can focus on learning exactly how computers deal with binary numbers, character encodings, memory addresses, registers, vector operations, stacks, system calls and so on so that none of this will present a hurdle later on in your programming career.
Avoid starting with the languages in the middle (C/C++, Java, Rust) where you have to deal with both at the same time.
That said, if there's something very specific that interests you, learn whatever is most useful in that field: JavaScript for web development, C for microcontrollers/Arduino, shell scripting if you're a Linux user, Java if you want to write Android apps.

Some decent assembly books aimed at beginners that deal with modern processors are Jeff Duntemann's Assembly Language Step-by-Step, Assembly Language Coding in Color: ARM and NEON, and Programming from the Ground Up. Or just get an emulator for an '80s computer like the ZX Spectrum or Commodore 64 and read one of the countless beginner's books on assembly for Z80 or 6502 CPUs.

A good book on higher-level/philosophical computer science concepts that doesn't require a strong mathematical background is Understanding Computation from O'Reilly. Other than that, just look at what universities teach in their curricula.

C was arguably the first language that was a good abstraction of how a computer works without being specific to one type of processor. That's why it's great for writing things like operating systems and other things where speed is important, while still remaining portable across different computers. You can run Linux on both your desktop computer (Intel processor architecture) and your phone (ARM processor) because it's written in C. Can't do that in assembly because you'd have to rewrite it for every type of processor, and can't do that with LISP or Python because it's too slow. So everyone back in the day started writing all these fundamental systems (many of which are still around from the 70's/80's) in C, and it will basically never go away - unless Urbit really takes off and the feudal internet aristocracy of the future writes everything in Hoon.
>> No. 1959 [Edit]
File 15759893559.gif - (44.46KB , 640x400 , download (1).gif )
What do you think about MATLAB? The most complex thing I've done with it is matrices, structures and that kind of thing. That's basically all I know, though I've tried dabbling in C before using that shitty "Learn C in 24 Hours" guide. Where should I go next? Algorithms? At what point do you really "know" a language and move on to learning something else? Also, have you heard of the Introduction to Microcomputers series by Adam Osborne? I haven't finished it, but it gave a very interesting glimpse into the hardware side of things. You learned x86, right?

Interesting links I can't do anything with, but maybe you or somebody else could:
I remember finding the 98 bible, but I don't have the link for whatever reason.
>> No. 1960 [Edit]
If you want to start with assembly, maybe also take a look at riscv. The spec is pretty clean and since it's from the risc lineage the instruction set is self-contained and easy to understand. One of the projects in my uni was to build a cpu (in circuit simulation software) and I was surprised at how compact it ended up being. Unfortunately the tooling and the ecosystem is still somewhat janky at the moment, but it's worth looking into since there's a chance it might take off in the future.

Not the poster you were responding to, but MATLAB's a cool language for any sort of numerical computing. It's starting to fade a bit now that there's numpy+python, but the lack of operator overloading sort of hurts python here. If you need one of the specialized toolboxes then there's no other real choice, but otherwise making the transition to numpy shouldn't be too unfamiliar, and it will be a good introduction to python.
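To illustrate the transition: the MATLAB idioms mostly map one-to-one onto numpy, with the main gotcha being that `*` is elementwise (this sketch assumes `numpy` is installed):

```python
import numpy as np

# MATLAB's  A * B  (matrix product) is  A @ B  in numpy;
# MATLAB's  A .* B  (elementwise product) is plain  A * B.
A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
print(A @ B)  # [[19 22] [43 50]]
print(A * B)  # [[ 5 12] [21 32]]

# Solving a linear system, like MATLAB's  A \ b:
b = np.array([1, 1])
print(np.linalg.solve(A, b))  # [-1.  1.]
```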
>> No. 1961 [Edit]
>Where should I go next?
I suggest just writing a lot of programs and looking up stuff on search engines and in reference manuals as needed.
Since we're in Christmas season, try doing the programming puzzles at and see how far you can get in each year's calendar.
As your skills as a programmer progress, you'll also see more and more opportunities for contributing to open source projects opening up.
You'll eventually run into a roadblock due to lack of knowledge, and that's always a good pointer towards what you should learn next.

If you're looking to learn more about how CPUs work, check out and maybe the Nand2Tetris course on which it is based. It basically leads you through the process of building an entire computer from just above the transistor level.

Since you know some C, you could look into Arduino microcontrollers, which let you control electronics (think LEDs, sensors, buttons, small LCD screens, etc.), if you find that sort of thing interesting at all.

If you feel like it, sure. Can never know enough about that topic. It all depends on your goals and what interests you. As I said, looking at what universities teach in their curricula is a good guideline when you're not sure what to read about next.
>At what point do you really "know" a language and move on to learning something else?
I recommend sticking with whatever language you know until you want to solve a problem it is simply not suited for. If you want to make websites for example, C isn't the right tool (except for some back-end stuff).
If you want to learn a new language just for the heck of it, my suggestion is to pick a language that's suitable for understanding a different programming paradigm than what you're used to -- such as Haskell for functional programming, Ruby for OOP, and some sort of assembly language.

>You learned x86, right?
It's what I started learning assembly on, using Duntemann's excellent tutorial, but I never wrote much in it. Had to drop the book half-way through because I became homeless and didn't have access to a desktop computer most of the time.
The architecture I'm most familiar with is ARM, in both the 32-bit and the 64-bit variant. That's what my phone has, and I have to say that ARM assembly is much more readable and intuitive than its x86 counterpart, and not nearly as mind-bogglingly complex. I can echo >>1960's recommendation to pick a RISC architecture for learning how assembly works, and ARM (formerly Acorn RISC Machines) is one of those. Most of what I know about it is straight from the documentation on ARM's website.
The one I'm second-most familiar with is the Z80, which has good official documentation and is very simple, but is much less consistent and logical than ARM.
If you're interested in reverse-engineering Windows apps or something, you obviously won't get around learning x86 though.
>> No. 1962 [Edit]
>If you're interested in reverse-engineering Windows apps or something, you obviously won't get around learning x86 though.
Well, I'd like to be able to do something with the pc-98 one day, but that's a bit of a pipe dream.
>> No. 1963 [Edit]
You might find this fun reading then:
>> No. 1967 [Edit]
>I'd like to be able to do something with the pc-98 one day, but that's a bit of a pipe dream.
Looks like a fun toy. Why do you think it's a pipe dream, given how much documentation for it is out there? Is it that you can't read Japanese?

Do you have real hardware or are you planning to do everything in an emulator?
>> No. 1968 [Edit]
File 157610422449.gif - (47.99KB , 640x399 , VGNyu1s.gif )
>Is it that you can't read Japanese?
Yep. I'm in the process of learning it. I'd also have to learn a substantial amount about the 98's dialect of x86, which isn't a walk in the park, and Japanese tech terminology, though a lot of that is hopefully written in katakana. If I do manage, it won't be any time soon. Learning how to program the Z80 first might be beneficial, since it's similar and simpler.
>Do you have real hardware or are you planning to do everything in an emulator?
Emulator. Is there much advantage to working on real hardware?
>> No. 1969 [Edit]
>Emulator. Is there much advantage to working on real hardware?
Depends on the emulator. If the emulator is inaccurate, your software may behave in unexpected ways once someone does try to run it on real hardware.
On the other hand, emulators may have nice debugging features that are superior to whatever you could find on real hardware. Even just save-states are a blessing in this regard.
At the very least, making frequent backups of your work will be a lot easier.

Most of my experience with Z80 programming comes from writing shitty demos for the Sega Master System (which I can't recommend if your goal is to learn about Z80 programming, because the documentation isn't nearly as good as for the popular personal computers of the time). I was mostly using Emulicious, which has a useful debugger, but I'm fairly certain that none of the programs I wrote would even boot on real hardware. I'd have to change around at least a few things in the file headers, probably some of the actual code too.
>> No. 1970 [Edit]
File 157619777878.png - (144.83KB , 1366x768 , 1484437985663.png )
There's a branch of the Neko Project emulator which can run Windows 98. I don't know how possible that is on any physical models. I do know different models have different specs, and some stuff that would work on one might not on another. Emulators are better for experimental-type stuff since you can adjust the specs however you want, even with greater capabilities than any real model of the system.
>> No. 1971 [Edit]
File 157620045928.png - (849.23KB , 2000x1125 , programming_challenges.png )
If you want to do something on the PC-98, why not start with doing stuff in BASIC?
>> No. 1972 [Edit]
That image is a troll-post, right? I mean, they're good projects, but a few of them seem to have their difficulties completely off. E.g. why are "Game of Life" and "English sentence parser" both medium? The former is a straightforward grid simulation, while the latter is a relatively sophisticated NLP project (unless you just call into a pre-existing library). Similarly, why is "text editor" hard but "javascript debugger" medium?
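For what it's worth, one generation of Life is just a nested loop over a grid. A minimal sketch in Rust (grid size hard-coded purely for illustration):

```rust
// One generation of Conway's Game of Life on a fixed-size grid
// (cells beyond the border count as dead). true = alive.
const W: usize = 8;
const H: usize = 8;

fn step(grid: &[[bool; W]; H]) -> [[bool; W]; H] {
    let mut next = [[false; W]; H];
    for y in 0..H {
        for x in 0..W {
            // Count live neighbours in the 3x3 block around (x, y).
            let mut n = 0;
            for dy in -1i32..=1 {
                for dx in -1i32..=1 {
                    if dx == 0 && dy == 0 { continue; }
                    let (nx, ny) = (x as i32 + dx, y as i32 + dy);
                    if nx >= 0 && ny >= 0
                        && (nx as usize) < W && (ny as usize) < H
                        && grid[ny as usize][nx as usize] {
                        n += 1;
                    }
                }
            }
            // Standard rules: a live cell survives with 2 or 3 neighbours;
            // any cell with exactly 3 neighbours is alive next generation.
            next[y][x] = matches!((grid[y][x], n), (true, 2) | (_, 3));
        }
    }
    next
}

fn main() {
    // A "blinker": three cells in a row oscillate between
    // horizontal and vertical each generation.
    let mut g = [[false; W]; H];
    g[3][2] = true; g[3][3] = true; g[3][4] = true;
    let g2 = step(&g);
    for row in &g2 {
        let line: String = row.iter().map(|&c| if c { '#' } else { '.' }).collect();
        println!("{}", line);
    }
}
```

The whole thing is one pure function over the grid; the "hard" part of the image's harder entries is orders of magnitude beyond this.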
>> No. 1973 [Edit]
>Design a Game Engine in Unity
A game engine within a game engine?
>> No. 2026 [Edit]
File 159089471993.jpg - (270.64KB , 500x650 , 2f502e6140df7ce9868c2f1b3db5f5a1.jpg )
I'm reading How to Design Programs. It's a Scheme book. There's a newer edition out for Racket, but I started the edition I did before knowing that. It's far from my first exposure to programming, but it's the first time I'm learning it seriously. The exercises are tough and I have to look at the answers a lot. Recursion is tough. Mutual recursion is tough. I'm doubting myself a bit.
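(For anyone following along: mutual recursion just means two definitions that call each other. HtDP exercises it over tree-shaped data, but the classic toy version, here in Rust rather than Scheme, looks like this:)

```rust
// Mutual recursion: two functions defined in terms of each other.
// Parity by counting down: even(n) asks odd(n-1), and vice versa,
// until the base case n == 0 is reached.
fn is_even(n: u32) -> bool {
    if n == 0 { true } else { is_odd(n - 1) }
}

fn is_odd(n: u32) -> bool {
    if n == 0 { false } else { is_even(n - 1) }
}

fn main() {
    println!("{}", is_even(10)); // true
    println!("{}", is_odd(10));  // false
}
```

Once this clicks, the data-driven versions in the book are the same shape, just with trees instead of a counter.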
>> No. 2027 [Edit]
There are those who interview for a programming job but cannot implement fizz buzz or similarly trivial constructs despite graduating with a CS degree. And yet, these people do get jobs. Masturbation is the only thing left.
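For reference, the bar really is that low; the whole screener is about a dozen lines (Rust here, but any language works):

```rust
// FizzBuzz: for 1..=n, print "Fizz" for multiples of 3, "Buzz" for
// multiples of 5, "FizzBuzz" for both, the number otherwise.
fn fizzbuzz(n: u32) -> String {
    match (n % 3, n % 5) {
        (0, 0) => "FizzBuzz".to_string(),
        (0, _) => "Fizz".to_string(),
        (_, 0) => "Buzz".to_string(),
        _ => n.to_string(),
    }
}

fn main() {
    for n in 1..=15 {
        println!("{}", fizzbuzz(n));
    }
}
```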
>> No. 2028 [Edit]
Conversely you also have those with demonstrated experience who get asked gotcha brainteasers.
>> No. 2029 [Edit]
Interviews are a joke.
>> No. 2030 [Edit]
Working is a joke.
>> No. 2031 [Edit]
Do you think masturbation could help you in an interview? I have to try that one.
>> No. 2032 [Edit]
I want to hear people's opinions on Rust. Things like ripgrep have piqued my interest.

Yes, there's no better way to assert your dominance.
On the other hand, you could end up as a sex offender.
>> No. 2033 [Edit]
Ripgrep is very nice (in fact all tools by that author are very handy).
There was also a brief discussion of Rust in /ot/.

I think there are a lot of neat ideas there, both from a PL-theory perspective (enforced lifetime tracking) and a practical one (succinct, helpful compiler messages). I'd like to see them make their way into C++ as well (the LLVM community is doing some work on improving static analyzers).
>> No. 2034 [Edit]
I'm not sure any of this can be brought into C++; the language has so much legacy and so many features that at this point it's nigh impossible to add anything without breaking at least *something*.
And PL theory, sadly, tends to go against good error messages. Standard ML and OCaml have lean and mean error messages, while Haskell is horrendous in this regard; once you add advanced type-level features into the mix, you can compare the errors (at least in kilobytes) to the ones you get from C++ templates.
>> No. 2035 [Edit]
Apparently Rust's type system is formalized via the notion of affine types, where every variable can be used at most once. There are also linear types, where a variable must be used exactly once. Wikipedia gives C++'s unique_ptr as an example of a linear type, but to me it seems like an affine type instead, since you can always choose to discard it (just let it go out of scope).

It's also not clear to me why they're called linear/affine.
>> No. 2044 [Edit]
They're called that because they come from the linear and affine branches of logic, where a hypothesis can be used exactly once or at most once, respectively.
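Rust's move checker is that "at most once" rule in action: a non-Copy value can be consumed once, or simply dropped unused, which is exactly the affine part. A small illustration:

```rust
// "At most once": a non-Copy value can be moved (used) once, or never.
struct Token(String);

fn consume(t: Token) -> String {
    t.0 // takes ownership; the caller's binding is invalid afterwards
}

fn main() {
    let a = Token("used once".to_string());
    let s = consume(a);
    // consume(a); // error[E0382]: use of moved value `a` --
    //             // a second use is what the affine discipline forbids.

    let _b = Token("never used".to_string());
    // _b is just dropped at end of scope: zero uses is allowed,
    // which is why the system is affine rather than linear.
    println!("{}", s);
}
```

A truly linear system would reject `_b` too, since it was never consumed; that matches the point above about unique_ptr being affine-like rather than linear.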
>> No. 2051 [Edit]
File 159649336968.png - (62.14KB , 870x666 , code.png )
I made this hashmap for up to 8 characters in C by dereferencing the strings.
Probably useless but I think it's pretty funny.
Probably useless but I think it's pretty funny.
>> No. 2052 [Edit]
Oh I think I understood what's going on there. I was confused at first because you mentioned hashmap, but what it's doing is re-interpreting the sequence of bytes "Cat....." or "Hello..." as an int64, which can be thought of as a pseudo-"hash". It's more like a fixed lookup table, and an interesting way of working around the fact that C doesn't support switch statements on strings.
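Since the original is only a screenshot, here's a guess at the same trick sketched in Rust: zero-pad the string to 8 bytes, reinterpret those bytes as a u64, and compare integers instead of strings.

```rust
// Reinterpret an up-to-8-byte string as a u64 so string dispatch
// becomes integer comparison (the workaround for C's switch not
// accepting strings).
fn key64(s: &str) -> u64 {
    let bytes = s.as_bytes();
    assert!(bytes.len() <= 8, "key too long");
    let mut buf = [0u8; 8]; // zero padding stands in for the '.' filler
    buf[..bytes.len()].copy_from_slice(bytes);
    u64::from_le_bytes(buf)
}

fn lookup(s: &str) -> &'static str {
    // match arms must be constants, so the comparisons are spelled
    // out as if/else -- much like what the C compiler emitted anyway.
    let k = key64(s);
    if k == key64("Cat") { "meow" }
    else if k == key64("Hello") { "world" }
    else { "?" }
}

fn main() {
    println!("{}", lookup("Cat"));
    println!("{}", lookup("Hello"));
}
```

The "Cat"/"Hello" keys and return values are made up for the sketch; the point is just the bytes-to-integer reinterpretation.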
>> No. 2053 [Edit]
Oh yeah I didn't even map any values.
No matter. I just looked at the assembly and no matter how many cases I add it still ends up being just a series of if statements, so it is completely useless!
