Beep Boop Bip

File 150448609042.jpg - (110.47KB , 1280x720 , mpv-shot0028.jpg )
No. 1547 [Edit]
It doesn't matter if you're a beginner or Dennis Ritchie, come here to talk about what you are doing, your favorite language and all that stuff.
I've been learning Python because C++ was too hard for me (I'm sorry Nenecchi, I failed you). I reached OOP and it feels weird compared to C++; anyway, I never got it completely.
>> No. 1549 [Edit]
File 150465284126.gif - (773.18KB , 320x240 , Dai_Mahou_Touge.gif )
I have this irrational perception that I don't want to learn any programming language because I fear it will become obsolete or superseded in the next 5 years. Is there any that almost invariably won't, maybe one in the C family or maybe even Rust?
>> No. 1551 [Edit]
>>1549
C and C++ will practically never be obsolete.
>> No. 1552 [Edit]
>>1549
Try a few and go with the one you like. This "if it gets obsolete then I wasted my time and will have to start from zero" mindset is unhealthy.
>> No. 1612 [Edit]
>>1549
First of all, a mainstream language won't be obsolete in just 5 years. At worst, it'll take a decade or two, maybe three, before one of these becomes obsolete. And even among niche ones, some will just never die.

Secondly, the time you spend learning a language is never lost. Most languages come with paradigms, idioms, etc. If you can program in one language, then you can pick up a similar one in less than a week (unless you want to be an expert, ofc).

Finally, learning a language you might never use isn't a waste of time. Some languages are just really interesting to learn for the sake of it. They change the way you think about programming, teach you to solve problems in a (better) different way, etc.

Most programmers (and every good programmer) know more than one language anyway.
>> No. 1614 [Edit]
>>1612
So, which are the mainstream languages that will take 10-30 years to become obsolete?
>> No. 1615 [Edit]
>>1614
I don't understand why you'd want a language that will die in 30 years when you can pick C or C++ and be set for multiple lifetimes.
>> No. 1628 [Edit]
>>1615
Could you please elaborate on what you meant? I'm technologically handicapped.
>> No. 1630 [Edit]
>>1628
He means that C and C++ will basically never become obsolete and you should learn those two if you're worried about learning a language that will become obsolete.
>> No. 1631 [Edit]
>>1630
>C and C++ will basically never become obsolete
I understood that, but not why. Why won't they become obsolete? Why won't there ever (within realistic expectations) be a language that makes them obsolete?
>> No. 1632 [Edit]
>>1631
Because they are so widespread. They have been used for years and years as the base of software. That, and from what limited knowledge I have, they are more basic languages than things like Java or C# or whatever else, which are like additions built with C/C++ as the foundation.
>> No. 1638 [Edit]
File 151736164996.png - (8.33KB , 762x600 , Pic-3.png )
I really want to get into an application development language, whether it's Java, C, or C++.

I've almost solely worked with web development languages: HTML, CSS, JS (and jQuery).

I have a tough time deciding on my own, however, and I'm out of touch with the best place to start. I'm not even sure what I'd program, beyond very simple games or desktop applications to automate the bizarre needs I occasionally have.
It's not fair to say I'm completely new to Java or C++. I did a semester of each in high school, and I understand data types and program flow, since I've also tried many web-related languages in addition to JavaScript.

>>1549
I find it funny how large a concern this is to new programmers, given how similar high-level languages are to each other. It's a question born of ignorance. Taken at face value, if you really wanted a language that will never go obsolete, try Assembly, or even better, binary.
C and C++ have at least another 30 years left in them. No other languages provide a closer-to-the-metal approach and unparalleled performance, in addition to a mountain of software libraries accumulated over the years.
Rust still seems like fanboy vaporware to me at this point. There's a lot of hype, but I've yet to download any software using it, and even the most competent Rust guy I know says it's not ready for serious deployment. It's the new Haskell / LISP.

That being said, it's surprising how fast web-based languages and libraries die a quick and morbid death, only to be replaced by something nearly identical, almost immediately after.
>> No. 1640 [Edit]
File 151756130680.gif - (898.01KB , 400x214 , Open.gif )
>>1638
>I find it funny how large of a concern this is to new programmers
Oh, I'm not a programmer at all. I used to dabble in web design when CSS was just emerging, and I recall hating having to stick with formatting. After a while I stopped practicing, since I disliked moving from HTML to CSS, and whatever knowledge I had of anything faded away. I'm just someone who would like to find a solution to the eternal problem of having to leave the house to work, and it seems like learning programming can alleviate that.
>It's a question born of ignorance.
Indeed.
>try Assembly, or even better, binary.
>C, and C++ have atleast another 30 years left in them.
What I also meant by the frustration of language-learning and the possibility of it becoming obsolete is personal intelligence boundaries. I'm not that smart, and math itself was never my forte. I'm skilled in organization and finding things in a sea of other things, but that doesn't mean I can just throw myself into learning binary and achieve whatever goals I have with it.

Considering even non-amateur programmers go for "easier?" options rather than C or C++ (what happened to C+?), I assume you need to have certain dispositions or certain levels of intelligence many don't have.

So, I guess I could rephrase my original question as: if I were to start today, with what language could I end up achieving work-from-home in X amount of time, with average intelligence? X being whatever it takes to be proficient enough to get jobs with it.
>> No. 1644 [Edit]
File 151791747138.gif - (241.21KB , 497x331 , giphy.gif )
Hikikomori here. I'm trying to find a way to make money from home without going outside. What programming language do you guys recommend for beginners?
>> No. 1648 [Edit]
File 151960936395.jpg - (111.59KB , 1280x720 , 150102648481.jpg )
>>1644

start from the beginning, learn LISP with SICP.

you will be competing with freelancers from the 3rd world, who are incapable of thinking critically. You need to learn how to code, not how to write in a language. You'll have to learn the language, yes, but that is secondary to what is most important.

Being able to speak fluent English gives you a bit of an advantage, and make sure to get a GitHub account up and going (and contribute at a REGULAR AND CONSISTENT RATE).
>> No. 1657 [Edit]
File 152234679719.gif - (747.57KB , 490x276 , Ready.gif )
>>1648
>You need to learn how to code, not how to write in a language.
Does this apply to the suggestion an anon made about learning Assembly? Or is that a brainlet filter?
>and contributing at a REGULAR AND CONSISTENT RATE
Not really that acquainted with GitHub. Do you mean commits (updates) or actually engaging with other people and their projects there?
>> No. 1665 [Edit]
>>1657
Abelson (or is that Sussman?) begins the MIT course on computer science by saying that 'computer science' is a terrible name for the subject.
Firstly, it's not a science; it's closer to engineering or art. Secondly, it's not really about computers, because the computer is but a tool for you.
Then he contrasts computer science to math by showing an equation for what a square root is, and notes that though the equation is correct it doesn't really tell you how to find one. He then shows a program that can find a square root of a number.
What programming really is about is giving precise instructions so that an abstract system can execute or compute them. The system is called a computer, but here that term defines a use for it, not the box you usually think of when encountering the word.
And understanding this fundamental idea of giving instructions is more important than any language or hardware that implements an environment where you're going to put the ideas into use.
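
(For illustration, the square-root program from that lecture can be sketched roughly like this in Python; the lecture itself uses Scheme, and the names here are mine, not from the course:)

```python
def sqrt_newton(x, tolerance=1e-9):
    """Approximate the square root of x by Newton's method,
    in the spirit of the SICP lecture's Scheme version."""
    guess = 1.0
    # Improve the guess until it is good enough: average it with
    # x / guess, which overshoots in the opposite direction.
    while abs(guess * guess - x) > tolerance:
        guess = (guess + x / guess) / 2
    return guess

print(sqrt_newton(2))  # close to 1.41421356...
```

The point is exactly the contrast drawn above: the math definition says what a square root is, while this procedure says how to compute one.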

I can't expand on Anon's suggestions about GitHub, but it sounds like good advice. Your GitHub page can be treated like a resume, so gather proof of your experience there. Git is also almost a necessity for working on actual projects.
One thing you could also do is bounty hunting. People place bounties on software features they want to see. And then you can try to make a successful startup and sell out or something
>> No. 1666 [Edit]
>>1665
Thanks for the reply, I loved the way you explained it. What would you call Computer Science if you were given the power and authority to change it forever?
>> No. 1672 [Edit]
>>1631
Because money. COBOL is probably the most hated language in history. And it still exists, not only in the form of legacy code bases that nobody has the money, time, or skill to replace, but because some people just can't let it die (I'm looking at you, IBM).

Languages such as C and C++ have been very, very, very popular. And it'd be impossible to replace all the software, libraries, etc. written in them in just 3 decades. They are also very robust, which makes them even harder to replace since they still werk.

But that's not the only reason. Nearly all mainstream languages are from what we call the "C family": they are all heavily inspired by C and look very close to each other (if you don't have any experience with languages of other families you might not see what I mean, as you probably think that C and Python are total opposites that have little to nothing in common). Therefore learning one lets you pick up others of the C family fairly quickly, so just for the sake of teaching they will never really disappear. C is a really simple language, the syntax can be learned very quickly, and you get to learn a lot of core concepts of programming without having to worry about lots of boilerplate that means nothing to you (yet). C/C++ will stay relevant for at least another 50 years. Probably 100, because you'd also have to wait until all the stubborn programmers of these languages die out...
>> No. 1680 [Edit]
File 152503807397.jpg - (257.23KB , 850x1275 , Languies.jpg )
Pick one.
>> No. 1681 [Edit]
>>1680
C
>> No. 1682 [Edit]
>>1680
A toss-up between C++ and VB: the latter because she's cutest and looks lonely, and the former because it's a language I actually use.
>> No. 1698 [Edit]
>>1680
Java is widely hated but still widely used. I have more experience with Java than any other language, though I'm trying to change that. It's verbose, has garbage collection, is portable thanks to the JVM (with a couple exceptions), admittedly somewhat bloated, with lots of documentation and libraries. Often seen as the poster child of the object-oriented paradigm, but it has recently added more functional features, such as lambda expressions. JavaFX isn't great, but it's easy enough to learn to make GUIs. One interesting thing about Java boilerplate and its verbosity is that you can write a hundred lines of Java that do essentially the same thing as a very short shell or Python script. But it's still useful for a lot of things. I do think Java and OOP in general tend to overemphasize extensibility and modularity, though. There are some bad design patterns and features in Java, like access modifiers or getters and setters. Not a big fan of that stuff. But it's a good way to learn about OOP and programming in general, like polymorphism, inheritance, control structures, and lots of other stuff I don't feel like writing out. Decent language despite all the flak it gets.

C++ is fast but easy to mess up with security and memory management. Widely used for things that depend on performance, such as games, but it just isn't worth the headaches. Learning C++ made me appreciate Java more -- garbage collection, references instead of pointers, and shit like that. For C++, you can use Qt or GTK. I personally never got into GUI development for C++, though I did for Java.

Python is okay. It's used for machine learning, Django (web dev backend), learning programming, and so on. "Forced indentation of code" is a meme on /prog/, since some people find it annoying that organization is syntax in this language. I'm surprised Python 2.x still exists, and it shows how making changes can cause fragmentation in a community. More people are adopting Python 3.x though, which is good. Paths for different versions of Python can be annoying. I've worked on a project that involved a tool called Anaconda, though, which made it easy for everyone to make sure they had the right versions of Django and Python and whatnot, to avoid the "well, it works on my machine" issues many people have. Lots of modules and community support too. However, Guido recently stepped down as BDFL, so who knows what the future of Python will be.

Ruby is dying. People use Django or Node instead of Rails. It's slow, basically competing with Python in the realm of relatively simple interpreted languages. Never really heard of it being used outside of RPG Maker XP and Ruby on Rails. Wouldn't recommend getting into it. It's a sinking ship.

PHP is a pile of garbage. It's ancient, full of security holes and black magic fuckery, and it should be avoided at all costs. Maybe you'll end up using it when maintaining some legacy web codebase, but it sucks ass. I know of some cool tricks for hacking poorly-coded PHP sites, using fun things like remote file inclusion and web shells. Interesting from an attacker's standpoint, but annoying as fuck if you're a web developer who has to deal with this. Don't use PHP.

C# is like Java, but for Microsoft shills. I never got into it, but it's only worthwhile if you're gung-ho about Microsoft (which I'm not).

JavaScript, despite all the hate (some of it deserved, for its weird quirks!), is one of the most important programming languages to learn. These days, you can't really do any web development without JavaScript. Very few people use vanilla JS; instead you use shit like Angular, Node, Express, jQuery, and so on. EcmaScript 6 got class-based inheritance syntax instead of the old weird prototypal inheritance of ES5. It's weird how JS is based on ES, but they're not exactly the same; I never really understood that. Anyway, JavaScript Object Notation (JSON) is cool too, and you can even use it with non-web stuff. It's a cool alternative to XML; I'd never use XML these days. NoSQL/document-based databases like MongoDB are cool, and a good start if you're learning JS. With the MEAN stack, you can have a JS frontend, JS backend, and a JSON database. Makes things slightly easier, even though web dev is pretty complicated now.
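
(For what it's worth, using JSON outside the web really is trivial; a tiny Python sketch with the standard json module, all names invented for the example:)

```python
import json

# A plain in-memory structure, nothing web-related.
config = {"name": "tohno", "threads": [1547, 1549], "debug": False}

# Serialize to a JSON string and parse it back.
text = json.dumps(config, indent=2)
restored = json.loads(text)

print(restored["threads"])  # [1547, 1549]
```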

Perl is extremely terse to the point of being unreadable. There are lots of cool one-liners you can do with it, and sometimes I even use a little bit of perl in my shell scripts, but overall, it's not really usable. I've seen some older sites using perl, but it's not aging well for modern concepts such as responsive design. Sort of reminds me of old-school cgi. I'm no perl connoisseur, but apparently perl6 was slow to be adopted. Many people stick with perl5, just like the split between Python 2 and Python 3. Wouldn't bother learning perl in 2018.

C is old-school procedural programming. Fast, but simple. You could think of it as C++ without the OOP, or the other way around: C++ is like C but with classes tacked on as an afterthought. Wanna learn about pointers and compiling and other more old-school stuff? I guess you could learn it with C (or C++). But in the real world, you're not likely to use it, except for a CS undergrad class, or maybe legacy code. C++? Sure. Pure C? Not so much. Maybe if you do embedded systems shit where resources are tight, or you really need that extra performance squeezed out of something. But then again, it's good to have knowledge of non-OOP paradigms, like procedural, imperative, and perhaps even functional (though I'd never recommend functional languages for anything other than messing around -- very few jobs with them, too obsessed with concurrency, no return statements, weird concepts like currying and monads; the only real cool thing is lambda expressions, which I use in Java all the time). Anyway, got a little off-topic, but C is kinda old-school and not really something you'd want to base a modern project on. If you want something faster, kind of like C or C++, maybe look into Rust, which is similar but with better memory safety built in.
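
(Those functional ideas aren't locked to functional languages either; a toy Python sketch of lambdas and currying, with names chosen just for the example:)

```python
from functools import reduce

# A lambda is just an anonymous function.
square = lambda n: n * n

# Currying in miniature: a function that returns a specialized function.
def adder(n):
    return lambda m: m + n

add5 = adder(5)
print(add5(3))  # 8

# Lambdas compose nicely with map/reduce: sum of squares of 1..3.
print(reduce(lambda a, b: a + b, map(square, [1, 2, 3])))  # 14
```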

I never got into Visual Basic.

I've heard good things about R, but I've never actually used it.

Never used Scala. Don't know much about it.

Shells are not programming languages. Shells are shells. Technically, you can do shell scripting, which is useful. But would I call that fully-fledged programming? Not so sure about that. I do a lot of bash/zsh one-liners and I like customizing my .zshrc and making cron jobs and all that jazz (though systemd is subsuming all that shit nowadays). Lots of cool and powerful stuff with shells. Definitely worth learning command line stuff if you want to program, no matter which OS or programming language you use. But be warned: PowerShell is a joke compared to Unix shells like bash or zsh.

ActionScript? Flash is dead. Not worth your time.

Other languages that aren't on that list, but are worth mentioning:

Swift -- optionals/nils are interesting, also it's what you need to make iOS apps, since Objective-C is being phased out

Kotlin -- successor to Java as the go-to language for Android development

Haskell (for academic or hobbyist purposes)

Rust

Go

Assembly, such as MIPS or ARM or x86: if you wanna learn more about how computers really work, it can be useful to learn about assembly. You'll learn about registers and all that good stuff. It's a total pain in the ass. You'll appreciate higher-level languages more after doing shit in assembly. Assembly is very simple -- and that's the problem. It's hard to get an idea of how combinations of jumps and pushes do anything. Higher-level languages introduce extra layers of abstraction, so you can think more about the problem you're trying to solve and less about CPU registers and whatnot. A lot of compilers compile to assembly, because it's pretty hard to read or reverse engineer (though some languages compile to bytecode instead). But if you want to get into reverse engineering or malware, assembly is more important to learn. But it doesn't make sense to try and make a program in assembly as opposed to something like Java or C++ instead.

Markup and "that's technically not a programming language" languages: HTML5, CSS3, Markdown, YAML, LaTeX, preprocessors such as Sass, and so on. Still important to know. Might not be Turing complete, but so what? Still useful. Pedants argue about what to call them, but regardless, you should know some of them anyway.

When some people ask "which programming language should I learn?" the answer is: many programming languages. You might only speak one language in daily life, but you might need to use multiple programming languages as a programmer. They have different design philosophies, different built-in methods, different libraries, run on different devices, and have different use-cases. They're not all general purpose, and even so-called general purpose languages are better for some things and worse for others.

There is no "best" language to learn, but stop obsessing over which one is best and just pick something. I'd suggest Python, HTML/CSS/JS, or Java to start with. When you learn your first programming language, you're learning programming, paradigm-specific stuff, and language-specific stuff. Then, for your second programming language (assuming it's in the same paradigm, which should ideally be OOP-based, even if it's not 100% OOP), all you're really learning is the syntax and language-specific stuff. It's way easier to learn another programming language after you've already learned your first one.

It's easy to learn programming, but only if you have realistic time expectations. If you think you're gonna make a game in a day, you'll be distressed by how complicated everything sounds. Rome wasn't built in a day. So you have to pace yourself, like learning linked lists one day, then stacks and queues, then binary trees the next, time complexity the day after that, and so on. That's not a really good order, just an example. But that brings up another topic: there is a difference between learning the basics of a language, and learning more in-depth topics, such as algorithms, data structures, software engineering, project management, best practices, devops/agile, tools, debugging, design patterns, etc.
>> No. 1699 [Edit]
>>1638
>if you really wanted a language that will never go obsolete try Assembly
Terrible advice, considering the limited use-cases of Assembly, and also how CPU architectures change over time. x86 might be hot now, but it's being overtaken by ARM. Different CPU architectures have different assembly code. I learned MIPS assembly in college, and it's pretty much useless. I don't even put it on my resume.
>> No. 1784 [Edit]
I'm aiming for a career in Mechanical Engineering so programming for me is going to be something I do to assist with prototyping. That is to say, it'll be primarily more of a hobbyist activity. On that note, I've decided to learn C first, then Forth, then Lisp and then Ruby.

C because I hear it's a simple yet powerful, "bare metal" programming language. I believe it also teaches you about how the computer works internally.
Forth because I hear it gets used in the aeroplane and space sectors of engineering. It's even closer to the "metal" than C, I hear. I heard it gives the programmer immense control and presents its own unique problems that challenge you to think differently.
Lisp because I hear it will induce some kind of revelation about programming when you come to understand it. I hear that it's perfect for when you don't know what you should be doing because Lisp code is flexible enough to be taken in a new direction with relative ease.
Ruby because I predict I'm going to want a more friendly language to code in for my own applications in the house when I'm not trying to be maximally efficient.
I also consider Ada and Fortran as possibilities.

My most recent completed project was a program that could print a list of all the palindromes from 10 to 1,000,000 without using string manipulation (since I have yet to learn it).
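
(One standard way to do that without string manipulation, sketched here in Python rather than whatever the original was written in: reverse the digits with div/mod and compare.)

```python
def is_palindrome(n):
    """Check whether n reads the same backwards,
    using only integer arithmetic (no strings)."""
    original, reversed_n = n, 0
    while n > 0:
        reversed_n = reversed_n * 10 + n % 10  # append the last digit
        n //= 10                               # drop the last digit
    return original == reversed_n

palindromes = [n for n in range(10, 1_000_000) if is_palindrome(n)]
print(len(palindromes))  # 1989 of them in that range
```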
>> No. 1785 [Edit]
>>1784
I would recommend LISP first, as you will get a better understanding of the science in computer science, and it better illustrates the benefits of doing things different ways (especially recursion).
C is great exactly for the reasons you list, but the manual memory manipulation crap and a lot of C's idiosyncrasies (including the compilers) will slow down the general concept stuff that you should be learning first and foremost.

I would also recommend Python over Ruby as a language for what you are trying to accomplish, as Python is a lot better supported overall and has a much larger library of publicly available code to "borrow" from.
>> No. 1786 [Edit]
>>1785
Thanks for the support, but unfortunately I've already bought books on C to get me started with it. I'm going to press on with C, just for the sake of reassuring myself that I didn't waste my money. Considering what you said though, I guess I could make Lisp second, and I'll learn it along with watching those SICP videos on YouTube before going into Forth.

I've heard that Python is popular for being popular so I'm somewhat aware of how well supported it is. When I look at Ruby code though and how it's all so sterilised of all of the "computer" things that you usually find in software code, I feel really drawn to it. It seems really comfortable. I'll just have to keep Python in mind as well.
Thank you.
>> No. 1787 [Edit]
File 154943397236.jpg - (85.93KB , 706x455 , slap.jpg )
Learning C++ atm. I need to learn data structures and algorithms, but I keep getting blocked by math and big O.
>> No. 1955 [Edit]
File 157517725294.jpg - (201.97KB , 1920x1080 , fecd8526c650eba6709a3a9fe9c7666e.jpg )
If you want a fun language to use, I recommend trying D. Its template system is awesome and a lot easier to use than the one in C++, as I recently discovered in a project that I'm working on. Wish it were more popular, however. Its rough edges could be smoothed out if it had more manpower.
>> No. 1958 [Edit]
File 157596783467.jpg - (431.80KB , 1080x2160 , Screenshot_20191210_083838_com_termux.jpg )
I can definitely recommend learning assembly. It helped me tremendously to get an intuitive understanding of how computers work. Many of the things in C++ and other languages that I had previously found hard to grasp (e.g. binary logic, bit shifts, flags, two's complement notation etc.) are like second nature to me now because of dealing with them all the time when writing assembly code.
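
(Those concepts can be poked at from a high-level language too; a small Python sketch of bit shifts and two's complement, with an 8-bit width chosen just for illustration:)

```python
# Shifting left or right multiplies or divides by powers of two.
assert 5 << 2 == 20
assert 20 >> 2 == 5

def to_twos_complement(n, bits=8):
    """Encode a signed integer as an unsigned value of the given
    bit width, using two's complement representation."""
    return n & ((1 << bits) - 1)

print(to_twos_complement(-1))    # 255, i.e. 0b11111111
print(to_twos_complement(-128))  # 128, i.e. 0b10000000
```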

As your first programming language, I recommend either learning something very high-level (Haskell, Python) where you can focus on learning algorithmic thinking without having to deal much with things like memory management, integer overflows and so on, and also start writing somewhat useful software early on; or start from the ground up with assembly so that you can focus on learning exactly how computers deal with binary numbers, character encodings, memory addresses, registers, vector operations, stacks, system calls and so on so that none of this will present a hurdle later on in your programming career.
Avoid starting with the languages in the middle (C/C++, Java, Rust) where you have to deal with both at the same time.
That said, if there's something very specific that interests you, learn whatever is most useful in that field: Javascript for web development, C for microcontrollers/Arduino, Shell-scripting if you're a Linux user, Java if you want to write Android apps.

Some decent assembly books aimed at beginners that deal with modern processors are Jeff Duntemann's Assembly Language Step-by-Step, Assembly Language Coding in Color - ARM and NEON, and Programming from the Ground Up. Or just get an emulator for an 80's computer like the ZX Spectrum or Commodore 64 and read one of the countless beginner's books on assembly for Z80 or 6502 CPUs.

A good book on higher-level/philosophical computer science concepts that doesn't require a strong mathematical background is Understanding Computation from O'Reilly. Other than that, just look at what universities teach in their curricula.

>>1631
C was arguably the first language that was a good abstraction of how a computer works without being specific to one type of processor. That's why it's great for writing things like operating systems and other things where speed is important, while still remaining portable across different computers. You can run Linux on both your desktop computer (Intel processor architecture) and your phone (ARM processor) because it's written in C. Can't do that in assembly because you'd have to rewrite it for every type of processor, and can't do that with LISP or Python because it's too slow. So everyone back in the day started writing all these fundamental systems (many of which are still around from the 70's/80's) in C, and it will basically never go away - unless Urbit really takes off and the feudal internet aristocracy of the future writes everything in Hoon.
>> No. 1959 [Edit]
File 15759893559.gif - (44.46KB , 640x400 , download (1).gif )
>>1958
What do you think about MATLAB? The most complex thing I've done with it is matrices, structures, and that kind of thing. That's basically all I know, though I've tried dabbling in C before, using that shitty "learn C in 24 hours" guide. Where should I go next? Algorithms? At what point do you really "know" a language and move on to learning something else? Also, have you heard of the Introduction to Microcomputers series by Adam Osborne? I haven't finished it, but it gave a very interesting glimpse into the hardware side of things. You learned x86, right?

Interesting links I can't do anything with, but maybe you or somebody else could:
https://web.archive.org/web/20180630204922/http://island.geocities.jp:80/cklouch/column/pc98bas/pc98disphw_en.htm
http://seclan.dll.jp/dtdiary/1999/dt19990924.htm
http://euc.jp/articles/pc9800.en.html#chap5
https://archive.org/details/PC9800TechnicalDataBook1986
https://46okumen.com/pachy98/
I remember finding the 98 bible, but I don't have the link for whatever reason.
>> No. 1960 [Edit]
>>1958
If you want to start with assembly, maybe also take a look at RISC-V. The spec is pretty clean, and since it's from the RISC lineage the instruction set is self-contained and easy to understand. One of the projects in my uni was to build a CPU (in circuit simulation software) and I was surprised at how compact it ended up being. Unfortunately the tooling and the ecosystem are still somewhat janky at the moment, but it's worth looking into since there's a chance it might take off in the future.

>>1959
Not the poster you were responding to, but MATLAB's a cool language for any sort of numerical computing. It's starting to fade a bit now that there's numpy+python, but the lack of operator overloading sort of hurts python here. If you need one of the specialized toolboxes then there's no other real choice, but otherwise making the transition to numpy shouldn't be too unfamiliar, and it will be a good introduction to python.
>> No. 1961 [Edit]
>>1959
>Where should I go next?
I suggest just writing a lot of programs and looking up stuff on search engines and in reference manuals as needed.
Since we're in Christmas season, try doing the programming puzzles at adventofcode.com and see how far you can get in each year's calendar.
As your skills as a programmer progress, you'll also see more and more opportunities for contributing to open source projects opening up.
You'll eventually run into a roadblock due to lack of knowledge, and that's always a good pointer towards what you should learn next.

If you're looking to learn more about how CPU's work, check out nandgame.com and maybe the Nand2Tetris course on which it is based. It basically leads you through the process of building an entire computer from just above the transistor level.

Since you know some C, you could look into Arduino microcontrollers, which let you control electronics (think LED's, sensors, buttons, small LCD screens etc.), if you find that sort of thing interesting at all.

>Algorithms?
If you feel like it, sure. Can never know enough about that topic. It all depends on your goals and what interests you. As I said, looking at what universities teach in their curricula is a good guideline when you're not sure what to read about next.
>At what point do you really "know" a language and move on to learning something else?
I recommend sticking with whatever language you know until you want to solve a problem it is simply not suited for. If you want to make websites for example, C isn't the right tool (except for some back-end stuff).
If you want to learn a new language just for the heck of it, my suggestion is to pick a language that's suitable for understanding a different programming paradigm than what you're used to -- such as Haskell for functional programming, Ruby for OOP, and some sort of assembly language.

>You learned x86, right?
It's what I started learning assembly on, using Duntemann's excellent tutorial, but I never wrote much in it. Had to drop the book half-way through because I became homeless and didn't have access to a desktop computer most of the time.
The architecture I'm most familiar with is ARM, in both the 32-bit and the 64-bit variant. That's what my phone has, and I have to say that ARM assembly is much more readable and intuitive than its x86 counterpart, and not nearly as mind-bogglingly complex. I can echo >>1960's recommendation to pick a RISC architecture for learning how assembly works, and ARM (originally Acorn RISC Machine) is one of those. Most of what I know about it is straight from the documentation on ARM's website.
The one I'm second-most familiar with is the Z80, which has good official documentation and is very simple, but is much less consistent and logical than ARM.
If you're interested in reverse-engineering Windows apps or something, you obviously won't get around learning x86 though.
>> No. 1962 [Edit]
>>1961
>If you're interested in reverse-engineering Windows apps or something, you obviously won't get around learning x86 though.
Well, I'd like to be able to do something with the pc-98 one day, but that's a bit of a pipe dream.
>> No. 1963 [Edit]
>>1961
>Z80
You might find this fun reading then:
http://www.chrisfenton.com/the-zedripper-part-1/
>> No. 1967 [Edit]
>>1962
>I'd like to be able to do something with the pc-98 one day, but that's a bit of a pipe dream.
Looks like a fun toy. Why do you think it's a pipe dream, given how much documentation for it is out there? Is it that you can't read Japanese?

Do you have real hardware or are you planning to do everything in an emulator?
>> No. 1968 [Edit]
File 157610422449.gif - (47.99KB , 640x399 , VGNyu1s.gif )
1968
>>1967
>Is it that you can't read Japanese?
Yep. I'm in the process of learning it. I'd also have to learn a substantial amount about the 98's dialect of x86, which isn't a walk in the park, and Japanese tech terminology, but a lot of that is hopefully written in katakana. If I do manage, it won't be any time soon. Learning how to program for the Z80 first might be beneficial since it's similar and simpler.
>Do you have real hardware or are you planning to do everything in an emulator?
Emulator. Is there much advantage to working on real hardware?
>> No. 1969 [Edit]
>>1968
>Emulator. Is there much advantage to working on real hardware?
Depends on the emulator. If the emulator is inaccurate, your software may behave in unexpected ways once someone tries to run it on real hardware.
On the other hand, emulators may have nice debugging features that are superior to whatever you could find on real hardware. Even just save-states are a blessing in this regard.
At the very least, making frequent backups of your work will be a lot easier.

Most of my experience with Z80 programming comes from writing shitty demos for the Sega Master System (which I can't recommend if your goal is to learn about Z80 programming, because the documentation isn't nearly as good as for the popular personal computers of the time). I was mostly using Emulicious, which has a useful debugger, but I'm fairly certain that none of the programs I wrote would even boot on real hardware. I'd have to change around at least a few things in the file headers, probably some of the actual code too.
>> No. 1970 [Edit]
File 157619777878.png - (144.83KB , 1366x768 , 1484437985663.png )
1970
>>1969
There's a branch of the Neko Project emulator which can run Windows 98. I don't know how possible that is on any physical models. I do know different models have different specs, and some stuff that would work on one might not on another. Emulators are better for experimental-type stuff since you can adjust the specs however you want, even with greater capabilities than any real model of the system.

https://sites.google.com/site/np21win/
>> No. 1971 [Edit]
File 157620045928.png - (849.23KB , 2000x1125 , programming_challenges.png )
1971
>>1968
If you want to do something on the PC-98, why not start with doing stuff in BASIC?
http://worholicanada.mydns.jp/pc98/00303.html
>> No. 1972 [Edit]
>>1971
That image is a troll-post, right? I mean, they're good projects, but a few of them seem to have their difficulties completely off. E.g. why are "Game of Life" and "English sentence parser" both medium? The former is a straightforward grid simulation while the latter is a relatively sophisticated NLP project (unless you just call into a pre-existing library). Similarly, why is "text editor" hard but "javascript debugger" medium?
>> No. 1973 [Edit]
>>1971
>Design a Game Engine in Unity
A game engine within a game engine?
>> No. 2026 [Edit]
File 159089471993.jpg - (270.64KB , 500x650 , 2f502e6140df7ce9868c2f1b3db5f5a1.jpg )
2026
I'm reading how to design programs. It's a scheme book. There's a newer edition out for racket, but I started the edition I did before knowing that. It's far from my first exposure to programming, but it's the first time I'm learning it seriously. The exercises are tough and I have to look at the answer a lot. Recursion is tough. Mutual recursion is tough. I'm doubting myself a bit.
>> No. 2027 [Edit]
There are those who interview for a programming job but cannot implement fizz buzz or similarly trivial constructs despite graduating with a CS degree. And yet, these people do get jobs. Masturbation is the only thing left.
>> No. 2028 [Edit]
>>2027
Conversely you also have those with demonstrated experience who get asked gotcha brainteasers.
>> No. 2029 [Edit]
>>2028
Interviews are a joke.
>> No. 2030 [Edit]
Working is a joke.
>> No. 2031 [Edit]
>>2027
Do you think masturbation could help you in an interview? I have to try that one.
>> No. 2032 [Edit]
I want to hear people's opinions on Rust. Things like ripgrep have piqued my interest.

>>2031
Yes, there's no better way to assert your dominance.
On the other hand, you could end up as a sex offender.
>> No. 2033 [Edit]
>>2032
Ripgrep is very nice (in fact all tools by that author are very handy).
There was also a brief discussion of rust in /ot/ (http://tohno-chan.com/ot/res/33905.html#i35079)

I think there's a lot of neat ideas there from a PL theory perspective (enforced lifetime tracking) and a practical (succinct, helpful compiler messages). I'd like to see them make their way to c++ as well (llvm community is doing some work on improving static analyzers).
>> No. 2034 [Edit]
>>2033
I'm not sure any of this can be brought into cpp; the language has so much legacy and so many features that at this point it is nigh impossible to add anything without breaking at least *something*.
And PL theory, sadly, tends to go against good error messages. Standard ML and OCaml have lean and mean error messages, while Haskell is just horrendous in this regard; once you add advanced type-level features into the mix, you can compare its errors (at least in kilobytes) to the ones you get from templates in cpp.
>> No. 2035 [Edit]
>>2032
Apparently Rust's type system is formalized via the notion of affine types, where every variable can be used at most once. There are also linear types where a variable can be used exactly once. Wikipedia gives C++'s unique_ptr as an example of a linear type, but to me it seems like an affine type instead since you can always choose to discard it (just let it go out of scope).

It's also not clear to me why they're called linear/affine.

https://en.wikipedia.org/wiki/Substructural_type_system
>> No. 2044 [Edit]
>>2035
They are called so because they came from Linear/Affine branches of logic, where you can use proofs once/at most once.
>> No. 2051 [Edit]
File 159649336968.png - (62.14KB , 870x666 , code.png )
2051
I made this hashmap for up to 8 characters in C by dereferencing the strings.
Probably useless but I think it's pretty funny.
>> No. 2052 [Edit]
>>2051
Oh I think I understood what's going on there. I was confused at first because you mentioned hashmap, but what it's doing is re-interpreting the sequence of bytes "Cat....." or "Hello..." as an int64, which can be thought of as a pseudo-"hash". It's more like a fixed lookup table, and an interesting way of working around the fact that C doesn't support switch statements on strings.
>> No. 2053 [Edit]
>>2052
Oh yeah I didn't even map any values.
No matter. I just looked at the assembly and no matter how many cases I add it still ends up being just a series of if statements, so it is completely useless!
>> No. 2054 [Edit]
>>2053
Did you compile with -O2/-O3? I'm pretty sure that past some point compilers will use binary decision trees for the branching instead of sequential conditionals. But again there's not much point to this as you're better off using a proper hash function anyway.
>> No. 2055 [Edit]
>>2054
The optimization flags do actually make it work, thanks for that.
Yeah it's pointless, I just think playing with pointers is fun.
>> No. 2063 [Edit]
I decided I wanted to make an elaborated strip poker game where the other players are JCs and JKs. Then I realised the hardest part isn't programming but making the art. Very sad.
>> No. 2069 [Edit]
I'm creating a CLI program that downloads manga chapters from MangaDex. As of right now, one may specify criteria for determining which chapters of manga to download: qualities such as chapter number, volume number, language (to which the chapter was translated), and groups' names.
One may also provide a template or output mask for the downloaded chapter archive's filename. For example, the default output mask is "{title} - c{chapter} (v{volume}) [{groups}]"; thus, given the first English chapter of Forget-me-not, the resultant filename would be "Forget-me-not - c001 (v01) [Hanashi].cbz". (Currently, zip files are the only supported format.)
Further, one may set the program's user agent and the delay between requests. The default for the latter is two seconds, to ensure that one is not blocked.

After I add support for different packaging formats and packaging by volume, finalize the CLI, and provide helpful end-user documentation, I plan to refactor and rewrite a good portion of the code. One module is needlessly complex and template-heavy, and other files need better documentation. If anybody would like to try using it, please let me know! As you can infer, the software is still in development, but I've used it a few times for my own archival needs.
>> No. 2070 [Edit]
I also need to determine how the program responds to overly long filenames on Windows. Considering that a manga's (or chapter's) title will be the usual culprit, I believe that shortening it and adding an ellipsis would be a decent solution. (One may specify a setting in the group policy or the registry to remove the path limitation, but that seems burdensome for the end user.)
>> No. 2071 [Edit]
>>2069
> needlessly complex and template-heavy
By template-heavy you don't mean C++ templates, do you? As much as I hate to be the one suggesting languages, this seems like a place where python would shine given the ease with which you can parse webpages in it.
>> No. 2072 [Edit]
>>2070
Why would that be necessary? You can hover your mouse over any file name and see the full thing in a box that appears.

Post edited on 19th Oct 2020, 6:35am
>> No. 2073 [Edit]
>>2071
>By template-heavy you don't mean C++ templates, do you?
I'm using D, whose templates are actually programmer-friendly, and it's only template-heavy because I wanted to test some ideas.

>this seems like a place where python would shine given the ease with which you can parse webpages in it.
Python would probably be a fine alternative, but I'm directly calling MangaDex's APIs; only JSON must be parsed, and D has that capability in the stdlib. Even if I must deal with HTML, there is an awesome D library that implements much of the JavaScript DOM library and interface.

>>2072
>Why would that be necessary? You can hover your mouse over any file name and see the full thing in a box that appears.
Unless I'm misunderstanding you, from my experience, if a file's path (i.e. filename + the folder hierarchy in which it's nested) is too long, you may not meaningfully interact with it. A few times in the past, I had to boot my PC into a Linux environment so that I could rename, move, or delete the offending files.

Post edited on 19th Oct 2020, 7:36am
>> No. 2074 [Edit]
>>2073
Ah neat, I've played around with D and it seemed quite nice – although I haven't been able to find a personal niche for it in my own work. I also didn't know mangadex had an api!

With regard to the path limits, I recall reading somewhere that even if you don't flip the registry flag to enable long paths globally, there's a way to call into win32 apis directly and force use of long paths via a special path prefix. I have done zero win32 development though, so I can't comment much further on that. If it's a significant enough issue maybe you could just target linux and use WSL to run it on windows?
>> No. 2075 [Edit]
>>2074
>Ah neat, I've played around with D and it seemed quite nice – although I haven't been able to find a personal niche for it in my own work.
Yeah, I feel its general-purpose nature is both a blessing and a curse. Its meta-programming capabilities are pretty nice, though.

>I also didn't know mangadex had an api!
Neither did I. My initial client implementation parsed the webpages, but after a cursory glance in my web console, I discovered its existence. I do wonder how long it's existed.

>With regard to the path limits, I recall reading somewhere that even if you don't flip the registry flag to enable long paths globally, there's a way to call into win32 apis directly and force use of long paths via some suffix.
You are correct: one prefixes the filename with a sequence of characters to bypass the limitations. However, if I read the docs correctly, there are some quirks with it. It'll take some experimentation.

>If it's a significant enough issue maybe you could just target linux and use WSL to run it on windows?
I don't think it'll come to that. Abbreviating the filename or applying the filename prefix should be suitable. Plus, Windows is my daily driver, and I'd like to have this program run natively.
>> No. 2082 [Edit]
I've been trying to conjure a design by which structs (i.e. aggregate value types) may be dealt with like classes and interfaces. An obvious answer is structural typing via meta-programming. However, tunnel vision is quite potent.
>> No. 2085 [Edit]
>>2082
> structural typing via meta-programming
Can you explain what you mean by this? For simulating OO in C via structs, the solution I've usually seen involves including the base class as the member of the derived classes so you can manually cast back and forth, and then essentially manually implementing the vtable to get the polymorphism.
>> No. 2087 [Edit]
>>2085
What I mean is that, given a function that has a parameter of type T, only operate on a subset of members specified by T; as long as a struct defines those members, then from the viewpoint of the function, it's considered equivalent to other types that do the same. (In the light of this description, I retract my solution's description: it's closer to duck typing than structural typing.)
>> No. 2088 [Edit]
>>2087
Yeah ok that makes sense. It's annoying in C though because you also need the same layout of the structs, which is why as I mentioned most people just include the base struct as the first member.
>> No. 2089 [Edit]
>>2088
I assume you're referring to something like this, right? (Sans encapsulating the parent's fields.)

#include <stdio.h>
#include <string.h>

struct Widget
{
    int id;
};

struct FooWidget
{
    int id;
    char* text;
};

void process(struct Widget *widget)
{
    printf("%d\n", widget->id);
}

int main(void)
{
    struct FooWidget foo;
    memset(&foo, 0, sizeof(foo));
    process((struct Widget*)&foo);
}

>> No. 2090 [Edit]
>>2089
Yeah exactly that's the idea. Although in the approach I mentioned you would do


struct FooWidget
{
    struct Widget base;
    char* text;
};


so that way you don't have to repeat all of the parent's members (and it also avoids issues regarding struct packing/alignment). A lot of codebases I've seen will in particular do this for logging, where all of the "inherited" classes will share the same first member and then the "logId()" macro or whatever can just case to that shared "base" that is the first member and extract out the id.

You can also go further and implement function polymorphism, not just member sharing, by manually passing around vtables like in the below example (since there's just one function I don't have a separate vtable member – I just put the function inline).


struct BaseWidget {
    int id;
    void (*dump)(struct BaseWidget *self);
};

struct ExtendedWidget {
    struct BaseWidget base;
    char* extra;
};

void dumpBase(struct BaseWidget *self) {
    printf("BASE: %d\n", self->id);
}

void dumpExtended(struct BaseWidget *self) {
    dumpBase(self);
    printf("DERIVED: %s", ((struct ExtendedWidget*) self)->extra);
}

void dump(struct BaseWidget *widget) {
    widget->dump(widget);
}

int main(int argc, char *argv[]) {
    struct BaseWidget base = {.id = 3, .dump = dumpBase};

    struct ExtendedWidget derived;
    derived.base.id = 4;
    derived.base.dump = dumpExtended;
    derived.extra = "foobar";

    struct BaseWidget *baseThatIsExtended = (struct BaseWidget *) &derived;

    dump(&base);
    dump(baseThatIsExtended);
}

>> No. 2091 [Edit]
>>2090
Neat. But Haruhi damn, I hate C's syntax for function pointers.
>> No. 2092 [Edit]
This is why nobody pays me to program.


struct MaskContext(string name, Placeholders...)
if (Placeholders.length > 0 && allSatisfy!(isPlaceholder, Placeholders))
{
    alias RequiredPlaceholders = Filter!(isPlaceholderRequired, Placeholders);
    alias RequiredParams = staticMap!(PlaceholderType, RequiredPlaceholders);
    alias AllParams = staticMap!(PlaceholderType, Placeholders);

    /// Constructor for all placeholder fields.
    this(AllParams params)
    {
        static foreach (i, P; Placeholders)
        {
            __traits(getMember, placeholders, P.identifier) = params[i];
        }
    }

    static if (RequiredPlaceholders.length > 0)
    {
        /// Constructor for only required placeholder fields.
        this(RequiredParams params)
        {
            static foreach (i, P; RequiredPlaceholders)
            {
                __traits(getMember, placeholders, P.identifier) = params[i];
            }
        }
    }

    // ヽ( ̄~ ̄ )ノ
}


And yet, it works!
>> No. 2093 [Edit]
Due to circumstances, I've returned to C++ after many, many years, and I must say that I have no idea what the fuck I'm doing. Grokking its template metaprogramming is difficult after enjoying D's relative simplicity; the lack of universal implicit initialization, the complexity of move semantics, and the ugly syntax are a thorn in my side; and no modules (for GCC, anyway) kills the soul. And yet, I'm having fun (with valgrind by my side). Plus, I get to re-enjoy Scott Meyers' talks and writings--always a good time.
>> No. 2094 [Edit]
No built-in unittesting is saddening, too.
>> No. 2095 [Edit]
>>2093
There is now "concepts" with C++20, it helps with templates a lot.
>> No. 2096 [Edit]
>>2095
Indeed. Template constraints are a great feature in D, and it seems concepts might be more powerful. However, as usual, C++'s take seems rather ugly.
>> No. 2097 [Edit]
>>2096
I wish SFINAE (and the hell that it has enabled) had never existed.
>> No. 2098 [Edit]
>>2097
It's certainly antiquated now, it seems.

Also, consider this:

template<typename... Args, typename LastArg, typename = void>
void foo(LastArg arg)
{
    // ...
}
foo<int, float>("Hello, world!");


I'm glad type inference with variadic template parameters is possible, but it's so odd. Cursory searches haven't revealed much about "typename = void", and cppreference (from where I learned this) doesn't go into detail.

Meanwhile, in D:

template foo(Args...)
{
    void foo(LastArg)(LastArg arg)
    {
        // ...
    }
}
foo!(int, float)("Hello, world!");


Readability at the cost of two template instantiations (unless this can be optimized), but I prefer it.
>> No. 2099 [Edit]
At first I was excited about constexpr, but it's stupidly limited: only "literal types" are supported, and "if constexpr" must be placed in function scope. So if you want a compile-time std::string (working with char{*|[]} kills the soul) or a replacement for the pre-processor, you're out of luck. Instead, I have to conjure up some tricks to work around these issues, and even that's not satisfactory. And here I thought C++ was catching up to D.
>> No. 2107 [Edit]
>>2099
With C++20 most of the limitations with constexpr are fixed. (std::string and std::vector work now too.) There is also "constinit" and other new features. Read through them.
>> No. 2108 [Edit]
>>1958
>LISP is slow
I hate this stigma that Lisp is somehow "slow" when it's absolutely not. SBCL can already produce images that are as fast as, if not faster than, what GCC produces, if you can be clever enough. Now I will say that writing Lisp to be as fast as C is a major pain; if you want to write fast code you should use Chicken (which lets you drop down into C at any time) or just use C.
I think this idea of Lisp being slow comes from it being a LISt Processor where everything is a "linked" list, and these lists have O(n) access time. Honestly, machines today (2000s and on) are fast enough to compensate for this, not to mention that most dialects allow you to use vectors when you're dealing with a truly large amount of data.
One more thing I would like to add, that really gives Lisp the edge over most languages, is that programs are treated the same as regular data; that is, programs can be manipulated just as regular data can. Long story short, Lisp machines and Lisp instruction sets/architectures are near trivial to design, and give the programmer, and user, some major benefits (not just speed). If you want to read more on this, I would suggest Guy Steele's paper "Design of LISP-based Processors".
>> No. 2109 [Edit]
>>2108
SBCL is pretty amazing. You can see this quantitatively in [1] where Lisp is within an order of magnitude of C's performance. In fact a lot of people's ideas about "fast" languages are out of date. I've heard people call Java a "slow" language, but it's really quite performant (thanks to a lot of effort put into hotspot jit).

[1] https://thenewstack.io/which-programming-languages-use-the-least-electricity/
>> No. 2110 [Edit]
>>2107
Thanks for the information. I had assumed that my toolchain was limited to C++17, but it seems GCC 10 is supported. Pretty excited to see how much of the pre-processor I can replace. The dream, however, is to convert the platform's system headers to D modules, and get GDC working. Don't know if I have the knowledge for the latter, though.
>> No. 2111 [Edit]
What do you guys think about a function that reads command-line options into a struct? The following is its documentation:


Parses command-line arguments that match the given `CLIOption`s into a struct and returns it.

Params:
- Options = A sequence of `CLIOption` instantiations.
- args = The command-line arguments passed into the program.

Returns: A struct composed of two fields: `values` and `helpWanted`. The former is another struct whose fields' identifiers and values are derived from the passed `CLIOption` instantiations. The latter signals whether -h|--help were specified--just like with `std.getopt`.


P.S. I wish we had a code tag, e.g.
[code][/code]


Post edited on 25th Nov 2020, 6:32pm
>> No. 2112 [Edit]
>>2111
I'm not sure if I fully understand what you're going for. Can you dynamically create fields in a struct? And what would be the advantage over returning a dictionary(/map)?

Incidentally I wish that all languages had something like Python's argparse. That's always been a pleasure to use and it handles all the common use-cases (required flags, optional flags, lists, etc.)
>> No. 2113 [Edit]
>>2112
>Can you dynamically create fields in a struct?
Fields are "mixed in" at compile-time, so the type is fully defined before runtime.

>And what would be the advantage over returning a dictionary(/map)
Since I'm programming in D, and D is statically typed, the value type of a dictionary would have to be a variant--which would introduce some friction. I could also hack together a solution with `TypeInfo`, but I'm not too keen on that.

>Incidentally I wish that all languages had something like Python's argparse.
Never used it as I rarely program in Python, but it does seem nice after reading the docs. I'll have to borrow some of its ideas.

My `parseArgs` function is built upon D's std.getopt, as the latter doesn't promote structure, in my opinion.


/**
Usage: calc numbers [options]

Takes a list of numbers separated by whitespace, performs a mathematical operation on them, and prints the result.
The default operation is addition.
*/

// Usually a bad idea like `using namespace std;`
import std;

// Default Value | long and short names | Optional argument that specifies the option's description in the help text.
alias OperationOpt = CLIOption!("add", "operation|o", CLIOptionDesc("The operation to perform on input (add|sub)"));
// Same as above except we specify a type instead of a value. The option's default value will resolve to its type's initial value, which would be `false` in D.
alias VerboseOpt = CLIOption!(bool, "verbose|v", CLIOptionDesc("Prints diagnostics"));

// -h|--help are automatically dealt with
auto result = parseArgs!(OperationOpt, VerboseOpt)(args);
if (result.helpWanted) { result.writeHelp; return; }
auto nums = args[1..$]; // Let's just assume that the user actually entered at least one number.

// An option's long name is the identifier in `values`. The implication is that long names must be also be D identifiers. However, I've ensured that common option names like `long-name` are resolved to `longname`. However, more bespoke option names will trigger a compiler error with a helpful message. This would not be a problem if `values` were an associative array whose keys are strings.
switch (result.values.operation)
{
    // Assume the variadic functions, `add` and `sub`, are defined.
    case "add": add(nums).writeln; break;
    case "sub": sub(nums).writeln; break;
    default: writefln!"Operation '%s' is not supported"(result.values.operation); result.writeHelp;
}


Three problems with my function and its associated templates:
1. I'd like `CLIOption` to take functions as a type. `std.getopt` can do this, but I've had issues creating a higher-level interface with this in mind. This is mostly due with how I designed things.
2. `parseArgs` should handle more than options, like `argparse`. After all, if it doesn't, mine should merely be called `parseOpts`.
3. I suck at programming.
>> No. 2114 [Edit]
>>2113
Ah neat that makes sense. Having not used D before, I was only vaguely aware of mixins. (It seems the definition of "mixin" being used here is slightly different than the conventional definition used in object-oriented languages? I've seen mixins in e.g. python/scala and there it's more akin to interfaces with default methods. But in D it seems it's a bit broader and more like templates, with support for compile-time preprocessing?)

>the value type of a dictionary would have to be a variant
Yeah most of the argument parsers I've seen in C++ deal with this by requiring you to manually cast any values that you access into the proper type. (There's also the gflags/abseil style argument libraries where you declare the variable you want to place the result into upfront. That works around the above issue, but on the flipside it's ugly and overkill for small projects). Creating a properly typed struct at compile-time would be a lot cleaner and safer.
>> No. 2115 [Edit]
>>2114
D has two types of mixins: string and template. The former embeds a string containing valid D statements and/or expressions into the code: `mixin("int foo = ", 42, ";");` -> `int foo = 42;`. This must be done at compile-time, and any variables passed into the `mixin` statement must be readable at compile-time.
Then there's template mixins; these are more like traditional mixins found in OOP languages, except, as you've mentioned, they may be parameterized with types, symbols, and values. They are "mixed in" with the `mixin` statement: `mixin SomeDefinitions!42;` If `SomeDefinitions` had a definition, `int foo = value`, where `value` is a template parameter, then said definition will be accessible from the scope in which the template was mixed, and `value` is substituted for `42`. This is in contrast to a normal D template where its definitions, after instantiation, reside in their own scope accessible through a symbol.
These given examples are rather trivial and don't do these features justice. For my command-line argument parsing library, I use string mixins to generate new structs at compile-time, and utilize mixin templates to factor out common definitions and compile-time processing. Further, there are D CGI libraries that expose template mixins that do all the setup drudgery, e.g. provide a `main()` and environment-variable processing.

As an aside, D allows you to define strings with `q{}`, where the string's contents are placed between the curly braces. This indicates to a text editor, IDE, or whatever to treat the string's contents as if they were D code (or any code, I suppose): highlight it, provide code-inspection capabilities, etc. These are helpful with string mixins.

>(There's also the gflags/abseil style argument libraries where you declare the variable you want to place the result into upfront. That works around the above issue, but on the flipside it's ugly and overkill for small projects).
I looked at them. I feel a little sick.
>> No. 2116 [Edit]
File 160688616514.jpg - (108.27KB , 1280x720 , [Doki] Mahouka Koukou no Rettousei - 10 (1280x720 .jpg )
2116
Alright, so I'm re-working that argument parsing thing, and funnily enough, template mixins have been a big help in refactoring. Combined with better and more succinct solutions to previous problems, the design is a lot cleaner. Documentation is better, too. With that said, I'm not sure of the best way to handle options' "optional" metadata:

alias VerboseOpt = Option!("verbose|v", false, OptionDesc("Garrulous logging") /* etc... */);

`OptionDesc` is one such piece of metadata. Right now, the `Option` template will pass the given variable-length list of metadata to a mixin template that will then define the appropriate fields. Thus, in the given example, a field of type `string`, whose identifier is `desc`, and with a value of "Garrulous logging" will have been defined in this instantiation of `Option`, i.e. `VerboseOpt`. The problem is that `parseArgs` will have to do some compile-time inspection on every `Option` instantiation to determine whether it has a description, i.e. a `desc` field; using the data therein or providing default values in the field's absence. This is not ideal for compilation times and for the code's clarity as this also extends to other pieces of metadata like `OptionCategory` or `OptionRequired`. It's not terrible, but again, not ideal. I have a better solution in mind, but a clean implementation of it is difficult for my moronic mind.
>> No. 2117 [Edit]
File 160705909858.jpg - (185.24KB , 1280x720 , !.jpg )
2117
Continuing my work on my command-line argument processing library (Now called "tsuruya" because naming is hard.), I have realized happiness through the digital world instead of just the 2D one.
Here's an example:

auto args = ["./test", "1", "2", "3"];
auto result = args.parseArgs!(Parameter!("integers", int[]));
assert(result.parameters.integers == [1, 2, 3]);
assert(result.usageText == "Usage: test <integers>");

`parseArgs` is instantiated with a `Parameter` struct template whose name, both in the generated programming interface and command-line interface, is "integers". By specifying the type of the parameter's value as a dynamic array of integers, `parseArgs` will read all non-option command-line arguments; convert them to `int`; and then add them to the parameter's array. (As an aside, if one were to specify a static array, `parseArgs` will only read k-number of non-option command-line arguments, where k is the static array's length.) A usage string is also generated based on what parameters and options (collectively known as "command-line interface objects") were given to `parseArgs`.
`Parameter` may also take a callable object, e.g. a function, instead of a type, and the value it expects will be that of the callable object's return type. Further, one may pass optional metadata to `Parameter` just like one may do with `Option`, e.g. `CLIODesc` and `CLIORequired`. The former defines a description for a command-line interface object that may be used in `parseArgs`'s generated help text. The latter specifies whether the parameter or option is, well, required to be in the command-line arguments.
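For instance, a callable-based parameter might look something like this. Just a sketch inferred from the description above, the conversion function and exact metadata syntax are made up, not the real API:

```d
import std.conv : to;

// Hypothetical converter: each matching command-line argument is fed
// through it, so the parameter's value type is the return type (int).
int parseHex(string arg) { return arg.to!int(16); }

alias AddressParam = Parameter!("addresses", parseHex,
        CLIODesc("Addresses given in hexadecimal"));
```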
>> No. 2118 [Edit]
>(collectively known as "command-line interface objects")
I scrapped this stupidity and renamed the `Parameter` templates to `Operand`, since that's what they actually represent. After all, "parameter" would include options too, which invites confusion. Anyway, on to error handling and all the fun that entails.
>> No. 2119 [Edit]
Oh how I wish for mutability during compile-time. The amount of recursive templates upon which I'm relying is making me sweat a bit.
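For anyone following along, this is the shape of it: since template arguments can't be mutated, any accumulation has to peel one element off per instantiation. Toy example, not from the library:

```d
// Recursive template summing a compile-time sequence: no mutable
// accumulator exists, so each instantiation handles one element and
// recurses on the rest.
template Sum(values...)
{
    static if (values.length == 0)
        enum Sum = 0;
    else
        enum Sum = values[0] + Sum!(values[1 .. $]);
}

static assert(Sum!(1, 2, 3) == 6);
```

Each step is a fresh instantiation the compiler has to cache, hence the sweating.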
>> No. 2193 [Edit]
I was trying to get a program I always use to do something under python 2.7 and it wasn't supported anymore. Looking up the changelog discussions, I saw a poster say "We shouldn't support such ancient distros". Christ... it's really bizarre to me just how much the attitude among programmers has shifted. Granted, decade-old software tends to be forgotten, but I have a hard time thinking of 2010 as "ancient", even as far as tech goes. Guess this is just me griping, but damn. I thought python 3.3 and 2.7 were still being used on the same systems.
>> No. 2194 [Edit]
>>2193
What a mess the python 2->3 transition was. Whose boneheaded idea was it to make things non-backwards compatible?
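Some of the breakage was baked into the language itself, not just the libraries. A few behaviors that changed silently between 2 and 3 (this runs under Python 3):

```python
# In Python 2, 3 / 2 evaluated to 1; Python 3 made / true division
# and reserved // for floor division.
assert 3 / 2 == 1.5
assert 3 // 2 == 1

# Python 2's str was bytes; Python 3 split text and binary data into
# two distinct, non-interchangeable types.
assert isinstance(b"abc", bytes) and isinstance("abc", str)
```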
>> No. 2195 [Edit]
>>2194
>Whose boneheaded idea was it to make things non-backwards compatible.
I don't know, but there's a growing philosophy that old digital technologies should be forcefully cut out of any currently updated projects. Windows 10, for example, has some serious fundamental flaws that make windows 7 look comparatively like a masterpiece, yet it's being prioritized so heavily that people are now cutting windows 7 support from their projects. This in particular is infuriating, especially because when I'm not on a linux machine I want to use windows 7. In my brief stint with windows 10 I discovered some horrific design flaws regarding path variables, registries, and worst of all administrator permissions. As it turns out, it is relatively easy on windows 10 for a file to revoke, absolutely and forever, all access from any user, including the system user itself. This is particularly unpleasant when said file is malware.
