
/tech/ - Technology

"Technology reveals the active relation of man to nature" - Karl Marx



File: 1689566441325-0.png (77.42 KB, 800x1124, ClipboardImage.png)

File: 1689566441325-1.png (224.94 KB, 618x592, ClipboardImage.png)

 No.20923

Fucking KNEEL. You will die but C will live. C won't die out because of embedded systems. So long as there is electricity, C will be forever. C is used in kernels. I like to see fools who think C will die. C prioritizes the purity of atomic control, raw power, and a small and manageable feature set, because that's what the rest of the computing world needs as a foundation. A feature not provided can't be flawed or slow: the fewer features are crammed into C itself, the less opportunity there is for bugs to creep in beneath the very feet of every OS and many interpreted languages at the same time. Python is the worst language to start with because it promotes bad practices and doesn't really teach you anything, not to mention dynamic types make it hard to understand certain concepts like how data is actually represented, and it creates this illusion that basically lies to people about what's actually happening. Having that much disconnect from processors that do very simple things to accomplish a specific task, and over-engineering solutions because they conform to language syntax, is just stupid. There is no better starting point than C. C syntax is explicit and clear and actually lets you do exactly what you want to accomplish, conforming to the actual hardware and not some made-up language syntax. It's why C will never truly be replaced. The current languages that are shilled in shillicon valley are all terrible. JavaShit is a complete disaster and we are stuck with it, and that's the reason we have tons of dependencies and libraries with very sketchy glue code. OOP is also bad, as it has a lot of hidden allocations that happen with constructors, and it makes compile times longer and code harder to optimize or debug compared to simple, direct functional code without a million rules you have to follow giving you no real freedom to design software. There are reasons to redesign and make custom engines for handling specific tasks. Most languages can't exist without C, as their runtimes are all basically C, and it's the language that runs other languages and provides the abstractions people use. When you no longer have to think about hardware or what your code is doing, that leads to ruin and slow, sub-optimal code and libraries and dependencies. Python is ugly and hard to read. You have absolutely no idea what most of those libraries are actually doing and you lose sight completely of programming and live outside of reality. C syntax is explicit and clear and actually lets you do exactly what you want to accomplish, conforming to the real fucking hardware and not some made-up language syntax. Anything that's competed with C in the past hasn't been comparable in performance. Languages like Java and Go never even had a chance because of their stupid garbage collectors. Garbage collection is for literal retards who can't clean up after their own data. C is such a simple language that you could learn the syntax in a day and the important parts of the standard library in less than a week, even pointers.

Becoming a programmer these days is like coming into a movie theater halfway through the movie. Computer science history weighs heavy over everything. After decades of messing around with computers and programming, I understand the approach of learning things in the order they were invented: everything builds on what came before. Everyone should learn C first if they want everything that follows to be many times easier, because you will have context and won't need to constantly ask "Why TF did they do it that way?" You'll already know.

 No.20924

File: 1689578548656.png (1.93 MB, 1920x1080, ClipboardImage.png)

>>20923
lmao is this how c/c++ bros justify not being able to do basic cross-platform networking without overly convoluted external libraries
>you could learn the syntax in a day
but you can't use it for anything useful without learning other libraries first
>hurr durr optimization!!!11!1
sure if you're programming for microcontrollers or something. insignificant otherwise

oop is essential for most modern programs. same with basic features such as memory safety. your mental gymnastics won't help you when it's time to debug your unreadable behemoth of a program

 No.20925

>>20924
>oop is essential for most modern programs.
Not pure OOP. In fact, a growing trend is to minimize object-orientedness, something I advocate for as well :)

Most production programming languages support objects and class constructs. At the very least, I advocate learning from Scala and porting some of its lessons to whatever other OOP-heavy language you use.
>your mental gymnastics won't help you when it's time to debug your unreadable behemoth of a program
Oh boy. Object-oriented programming is also responsible for messes that not even God himself can parse.

 No.20926

File: 1689582103262.jpg (85.49 KB, 1280x474, DDlV1yyWsAIEhvt.jpg)

that's cute. you're like a little baby to me

 No.20927


 No.20928

>>20927
Heh.
As always in software development, it's all about choosing the right tool for the job.

Most of the work I do is backend web API development. C is objectively the wrong language to use here for basically all use cases. If you're a hobbyist, then there's no wrong language. Someone made an HTTP server in Forth, and they run a really cool website.
https://www.1-9-9-1.com/

They even quote Fredric Jameson.
<You're using Gforth, which came out in 1992. Also, it's 2017.
>Okay. But Fredric Jameson establishes that in postmodernism we have experienced a weakening sense of historicity such that what is, what was, and what will be all exist as presents in time. 1970, 1991, 1992, and 2017 all happen simultaneously. Hence developers working on new projects while still coding in decades-old text editors. They write the future in the past and are made present in so doing.

For me the best production backend web API language out there, as I mentioned, is Scala. It has huge issues, don't get me wrong, but it's more good than bad. Unfortunately, you also need to avoid the temptation of using the fancy type system in most cases.

If you just use Scala as a polished Java then there's little reason to use Java over Scala, in most cases. Spring Boot sucks btw. Yeah, it does fucking everything, but that's not a good thing. There is no good excuse for using Java over Kotlin.

As a hobby, I would rather be building stuff I'm somewhat confident works at compile time than busting my balls debugging, but a good friend loves to program in C/C++ and I totally get it. Haskell is also fun, but I would rather not use it than have to deal with its atrocious build system.

 No.20929

>>20928
/g/ memes aside, I feel I should add that while C has many problems, it also has a very mature toolset. You can formally verify that your C programs are free of whole classes of bugs, at least for the properties you bother to specify, using tools like Frama-C. It takes a long time, but still.
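
For anyone curious what that looks like in practice, here's a minimal sketch: an ACSL contract on a made-up function (checked_add and the bounds are purely illustrative) that Frama-C's WP plugin can then try to prove.

#include <limits.h>

/*@ requires 0 <= a && 0 <= b && a + b <= INT_MAX;
    ensures \result == a + b;
*/
int checked_add(int a, int b) {
    /* the contract above says: given non-negative operands whose sum fits
       in an int, the function returns exactly their sum, no overflow */
    return a + b;
}

You'd run it through something like frama-c -wp file.c and the prover either discharges the proof obligations or tells you which one it couldn't show.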

 No.20930

File: 1689593584730.png (4.27 KB, 153x153, zig.png)

Is it a new /g/ copypasta? There is some truth and some retarded stuff in there.

I agree that C code will continue to run long after all of us browsing this website are dead.
It's a relatively simple language in terms of features, yet it's incredibly powerful in terms of what you can achieve with it.
The kernels of most operating systems are written in C. The runtimes of most garbage-collected languages are written in C. There is a C compiler for almost any computing platform and processor.
Learning C is a must for any programmer who wishes to be good at their craft and understand how our computing environments are implemented. For example, Java/C++-style OOP is nothing but a bunch of pointers to structs, functions and unions under the hood, plus some void* manipulation wizardry for generics.
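
To make that concrete, here's a rough sketch (all names invented for illustration) of what a C++-style virtual call boils down to when hand-rolled in plain C:

#include <stdio.h>

struct animal;                                  /* the "base class" */
struct animal_vtable {
    void (*speak)(const struct animal *self);   /* the "virtual method" */
};
struct animal {
    const struct animal_vtable *vt;             /* vtable pointer, set by the "constructor" */
    const char *name;
};

static void dog_speak(const struct animal *self) {
    printf("%s says woof\n", self->name);
}
static const struct animal_vtable dog_vt = { dog_speak };

int main(void) {
    struct animal rex = { &dog_vt, "Rex" };
    rex.vt->speak(&rex);                        /* dynamic dispatch = a call through a function pointer */
    return 0;
}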

However, C is not a good first language to learn.
It's a good 2nd or 3rd language, but the first language I tried to learn was C and I was quickly discouraged, in part because of the old-school build chain.
Python is a good first language, because you can rewrite and execute code quickly, it's immediately useful thanks to its huge number of libraries, and its type system, despite being dynamic, is more sound than C's. Python is strongly typed (it has TypeError exceptions), while C is weakly typed, i.e. you can implicitly convert a pointer value to a signed long and your program will still compile. The only mainstream language with a worse type system than C is JavaScript.
Also, using C correctly is hard: you can segfault very easily, or make a program that needlessly uses gigabytes of RAM because you did something retarded like calling malloc() in a loop and forgetting to call free() (sketch of both below).
Let's not even talk about the "undefined behavior" rabbit hole.
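
A tiny sketch of both complaints, if you've never seen them (values and sizes made up): the pointer-to-long line is a constraint violation that most compilers still accept with just a warning, and the loop leaks because free() is never called.

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int x = 42;
    long n = &x;                    /* weak typing: a pointer silently becomes a long (warning, not error, on most compilers) */
    printf("%ld\n", n);

    for (int i = 0; i < 1000000; i++) {
        char *buf = malloc(4096);   /* allocated on every iteration... */
        if (buf) buf[0] = 'x';
        /* ...but never freed, so memory use just keeps growing */
    }
    return 0;
}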

I've tried the Zig programming language recently, and it seems to be a really promising alternative to C for low-level programming.
It's easy to install on Windows, Linux, macOS and FreeBSD, things like "defer" simplify your life a lot (see the C sketch below for the kind of cleanup bookkeeping it replaces), the compiler complains more about unsound algorithms and type conversions, and the language documentation (without the stdlib) is only one HTML page.
The build system uses the language itself, instead of arcane stuff like Makefiles and CMake, and can relatively easily integrate C and C++ libraries into Zig programs, especially C libraries. You can even use Zig solely as a build system for C and C++ software.
The only problem is that the language is very unstable so far: you can't compile 0.10.1 code with the newer 0.11.0 version without changing your code. Tutorials from 2 years ago may not work anymore. It's the most frustrating part of using Zig; also, some prior knowledge of C is recommended IMO.
But when it works, it's much more fun than using C (and C++) because you don't have to avoid so many footguns. I would rather use it than C/C++ for the few low-level hobby projects I have in mind, if I can. One guy working on OpenJDK said it's the best-designed language he has seen since Scheme (https://news.ycombinator.com/item?id=24292760), so it makes me hopeful about Zig's future.
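
Since this is a C thread, here is the C side of what defer saves you from, a hedged sketch with made-up file names and sizes, where every early return has to remember every resource acquired so far:

#include <stdio.h>
#include <stdlib.h>

int process(const char *path) {
    FILE *f = fopen(path, "r");
    if (!f) return -1;

    char *buf = malloc(4096);
    if (!buf) { fclose(f); return -1; }   /* forget this fclose() and the handle leaks */

    size_t n = fread(buf, 1, 4096, f);
    printf("read %zu bytes\n", n);

    free(buf);                            /* both releases have to be repeated here... */
    fclose(f);                            /* ...and on every other exit path */
    return 0;
}

int main(void) {
    return process("example.txt") == 0 ? 0 : 1;   /* hypothetical input file */
}

In Zig, the equivalent defer line sits right next to the acquisition and the cleanup runs on every exit path automatically.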

 No.20932

I'm so glad I grew out of the language wars.

 No.20933

File: 1689614294358.webm (3.19 MB, 240x240, kneeltoC.webm)

>>20930
>Is it a new /g/ copypasta? There is some truth and some retarded stuff in there.
Good guess anon. But not quite. I was on a Fireship video titled "C in 100 seconds" and was laughing at all the comments arguing over C, so I took some of them and mashed them together into a funny copypasta for the sake of performance art. It's something I do sometimes. I'm weird.

 No.20934

>>20933
>he doesn't know C also lies to users
asm and microcode gang
also gotta post this classic

 No.20936

File: 1689628196429.png (377.81 KB, 1285x1285, ClipboardImage.png)

>>20934
amazing
they even did the guitar solo

 No.20937

>Python is the worst language to start with because it promotes bad practices and doesn't really teach you anything
angry Python NPC face: "0.2 + 0.2 = 0.5"
>The current languages that are shilled in shillicon valley are all terrible.
what about Rust? I heard it's fun in "safe mode" but unsafe mode is cringe

 No.20939

>>20936
>>20934
since you enjoyed it so much…

 No.20940

>In an announcement that has stunned the computer industry, Ken Thompson, Dennis Ritchie, and Brian Kernighan admitted that the Unix operating system and C programming language created by them is an elaborate April Fools prank kept alive for over 20 years. Speaking at the recent UnixWorld Software Development Forum, Thompson revealed the following:
>
>"In 1969, AT&T had just terminated their work with the GE/Honeywell/AT&T Multics project. Brian and I had just started working with an early release of Pascal from Professor Niklaus Wirth's ETH labs in Switzerland and we were impressed with its elegant simplicity and power. Dennis had just finished reading "Bored of the Rings", a hilarious National Lampoon parody of the great Tolkien "Lord of the Rings" trilogy. As a lark, we decided to do parodies of the Multics environment and Pascal. Dennis and I were responsible for the operating environment. We looked at Multics and designed the new system to be as complex and cryptic as possible to maximize casual users' frustration levels, calling it Unix as a parody of Multics, as well as other more risque allusions. Then Dennis and Brian worked on a truly warped version of Pascal, called "A". When we found others were actually trying to create real programs with A, we quickly added additional cryptic features and evolved into B, BCPL and finally C. We stopped when we got a clean compile on the following syntax:
>
>
for(;P("\n"),R--;P("|")) for(e=C;e--;P("_"+(*u++/8)%2)) P("|"+(*u/4)%2);

>
>To think that modern programmers would try to use a language that allowed such a statement was beyond our comprehension! We actually thought of selling this to the Soviets to set their computer science progress back 20 or more years. Imagine our surprise when AT&T and other US corporations actually began trying to use Unix and C! It has taken them 20 years to develop enough expertise to generate even marginally useful applications using this 1960's technological parody, but we are impressed with the tenacity (if not common sense) of the general Unix and C programmer. In any event, Brian, Dennis and I have been working exclusively in Lisp on the Apple Macintosh for the past few years and feel really guilty about the chaos, confusion and truly bad programming that have resulted from our silly prank so long ago."
>
>Major Unix and C vendors and customers, including AT&T, Microsoft, Hewlett-Packard, GTE, NCR, and DEC have refused comment at this time. Borland International, a leading vendor of Pascal and C tools, including the popular Turbo Pascal, Turbo C and Turbo C++, stated they had suspected this for a number of years and would continue to enhance their Pascal products and halt further efforts to develop C. An IBM spokesman broke into uncontrolled laughter and had to postpone a hastily convened news conference concerning the fate of the RS-6000, merely stating "Workplace OS will be available Real Soon Now." In a cryptic statement, Professor Wirth of the ETH institute and father of the Pascal, Modula 2, and Oberon structured languages, merely stated that P. T. Barnum was correct.
>
>In a related late-breaking story, usually reliable sources are stating that a similar confession may be forthcoming from William Gates concerning the MS-DOS and Windows operating environments. And IBM spokesman have begun denying that the Virtual Machine (VM) product is an internal prank gone awry.

 No.20942

>>20940
shush don't let the Coyim know

 No.20943

>>20923
I like C because it is easy to leave vulnerabilities in.
Lots of things use safer languages these days and it has ruined hacking as a hobby imho, but you are right about embedded devices: they are still very fun. If you exclusively target routers, cameras, switches, whatever these 'people' were marketing as the 'internet of things' a few years back, and so on, quite often you will find poorly written C/C++, old Linux, pre-ASLR kernels, etc.
It's good times, but as a language it's probably not so great for our current mode of production; I am sure we will see more and more businesses move away from it.

 No.20944

>>20943
You can make your C even more underhanded by compiling it with a C++ compiler. For example, C++ has a wider class of lvalue expressions whose evaluation order around assignment may be compiler dependent. The program:
#include <stdio.h>
int main(void) {
  char b[4] = "ABC";
  char *a = b + 1;
  /* ++a on the left and a[-1] on the right are unsequenced,
     so which value of a the right-hand side sees is up to the compiler */
  *(++a) = a[-1];
  puts(b);
  return 0;
}

outputs ABB when compiled with g++ and ABA when compiled with clang++.

 No.20945

>>20944
I heard that C with a C compiler is not Turing complete? 😮

 No.20946

>>20945
I think this has to do with the requirement of a Turing machine to process a theoretically infinite input "tape" conflicting with C's mandated use of fixnum pointers. You could get around this limitation by writing custom allocation and dereferencing routines that use arbitrary-length pointers and grow a theoretically unbounded page table by manually swapping to disk.
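
Very roughly, something in this direction (names, backing file and offsets are all made up; a real construction would need arbitrary-precision addresses and chained backing files, since even uint64_t is still a fixnum):

#include <stdio.h>
#include <stdint.h>

typedef struct { FILE *backing; } tape_t;      /* "memory" lives in a file, not behind C pointers */

static unsigned char tape_read(tape_t *t, uint64_t addr) {
    unsigned char c = 0;
    fseek(t->backing, (long)addr, SEEK_SET);
    if (fread(&c, 1, 1, t->backing) != 1) c = 0;
    return c;
}

static void tape_write(tape_t *t, uint64_t addr, unsigned char c) {
    fseek(t->backing, (long)addr, SEEK_SET);
    fwrite(&c, 1, 1, t->backing);
}

int main(void) {
    tape_t t = { fopen("tape.bin", "w+b") };
    if (!t.backing) return 1;
    tape_write(&t, 1000000, 'x');              /* the "address" is just a number we interpret ourselves */
    printf("%c\n", tape_read(&t, 1000000));
    fclose(t.backing);
    return 0;
}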

 No.20947

>>20946
Wouldn't that mean that it is then theoretically possible to do robust static analysis on programs for things like security vulnerabilities, segfaults, shit like that?

 No.20948

>>20947
Bounds checking of arrays is already a staple of linters, and gcc has a switch for it I believe. What makes C programs unpredictable the most is the ambiguity of API definitions.

Character strings may either be null-terminated or passed with their length (functions handling both exist), yet there is nothing stopping the caller from violating both conventions. While the compiler can hook into allocator calls to perform static bounds checks or warn about an index range, the programmer may misplace or inadvertently copy a null terminator, or compute a string length smaller than the actual data (sketch below).
When passing string arguments to external functions, static analysis is simply impossible. The newer standards do define some semantics whereby the compiler can detect some of the possible errors though: https://gustedt.wordpress.com/2023/06/10/enforced-bounds-checking-for-frozen-function-interfaces/
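
The classic way to trip over both conventions at once (a made-up minimal example): strncpy() won't add a terminator when the source fills the buffer, so any later length calculation reads past the data.

#include <stdio.h>
#include <string.h>

int main(void) {
    char dst[4];
    strncpy(dst, "ABCDEFG", sizeof dst);   /* copies 'A','B','C','D' and no '\0' */
    printf("%zu\n", strlen(dst));          /* scans past the end of dst: undefined behaviour */
    return 0;
}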

 No.20949

>>20948
I thought the biggest issue is aliasing.

 No.20950

>>20949
Writing through aliased pointers isn't strictly required for use of the language and can be forbidden with restrict from C99, or __restrict, which most C89-era compilers implement.
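
For reference, a minimal sketch of the qualifier (function and names invented): restrict is a promise that the two pointers never alias, so the compiler is free to reorder and vectorize, and passing overlapping arrays becomes undefined behaviour.

#include <stddef.h>

void scale_add(float *restrict dst, const float *restrict src, size_t n) {
    for (size_t i = 0; i < n; i++)
        dst[i] += 2.0f * src[i];   /* compiler may assume dst[i] never overlaps any src[j] */
}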

Compilers can detect dangling pointers in function scope just as well as uninitialized variables. The state necessary for this is lost when pointers are passed to other non-inlined functions as arbitrary fixnums.
If, however, C pointers were a datatype with shared state that had to point to a valid memory location, dangling pointers could at least be diagnosed with runtime checks. Some testing frameworks modify the allocator to track all current allocations (rough sketch below).
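
Roughly the idea, stripped to the bone (everything here is illustrative, not any particular framework's API): wrap malloc/free, remember which pointers are live, and let a runtime check reject anything else.

#include <stdio.h>
#include <stdlib.h>

#define MAX_LIVE 1024
static void *live[MAX_LIVE];                    /* table of currently valid allocations */

void *tracked_malloc(size_t sz) {
    void *p = malloc(sz);
    for (int i = 0; p && i < MAX_LIVE; i++)
        if (!live[i]) { live[i] = p; break; }
    return p;
}

void tracked_free(void *p) {
    for (int i = 0; i < MAX_LIVE; i++)
        if (live[i] == p) { live[i] = NULL; break; }
    free(p);
}

int tracked_is_live(const void *p) {            /* the runtime validity check */
    for (int i = 0; i < MAX_LIVE; i++)
        if (live[i] == p) return 1;
    return 0;
}

int main(void) {
    char *p = tracked_malloc(16);
    printf("live before free: %d\n", tracked_is_live(p));
    tracked_free(p);
    printf("live after free:  %d\n", tracked_is_live(p));   /* a dangling pointer is now detectable */
    return 0;
}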

In any case, the semantics for passing standard C types require a lot of faith from both the caller and the subroutine, which I see as the primary defect at work.

