The paradigms from different programming languages aren't always compatible. The differences in those paradigms are the strengths and weaknesses that make languages better or worse for various applications. It's hard for me to imagine low-level plumbing that would give you the languages' strengths when mixing languages -- some mixes just don't make sense.
As an example... why would I want to call Python from C? If I'm writing in C, it's because I want high performance with low-level control of resources. To call a Python function, I suddenly need to spin up the Python interpreter, including baggage like its garbage collector -- there goes my performance and low-level control, and I'm suddenly running a massive stack that I didn't write!
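The one case where this direction is cheap is when the interpreter is already up and C merely calls back into it. A minimal sketch using CPython's stdlib ctypes, handing libc's qsort a Python comparator (the CDLL(None) symbol lookup assumes a Unix-like system):

```python
import ctypes

libc = ctypes.CDLL(None)  # symbols of the running process; assumes Linux/macOS

values = (ctypes.c_int * 5)(5, 1, 7, 33, 99)

# A C-callable function pointer type wrapping a plain Python comparator.
CMPFUNC = ctypes.CFUNCTYPE(ctypes.c_int,
                           ctypes.POINTER(ctypes.c_int),
                           ctypes.POINTER(ctypes.c_int))

def py_cmp(a, b):
    return a[0] - b[0]

cmp_callback = CMPFUNC(py_cmp)  # keep a reference alive while C may call it
libc.qsort(values, len(values), ctypes.sizeof(ctypes.c_int), cmp_callback)
print(list(values))  # [1, 5, 7, 33, 99]
```

Note this only works because Python is the host: the interpreter, its GC, and its object model are already running before C gets a function pointer.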
Alternately, you could compile the Python code ahead of time (not to CPython's own bytecode, but to something closer to machine code) so that C can call it performantly... but then you lose the benefits of Python. Suddenly your Python code needs to be compiled after every change (slowing down the quick iteration that is one of Python's greatest benefits), and you're required to declare and enforce strict types (no more easy duck typing!) so that C can count on getting back data in the expected format.
In contrast, calling C from Python does make sense and can already be done.
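For instance, CPython's stdlib ctypes can call straight into libc with a few lines (the CDLL(None) process-handle trick assumes a Unix-like system; on Windows you'd load a DLL by name instead):

```python
import ctypes

libc = ctypes.CDLL(None)  # symbols of the running process; assumes Linux/macOS

# Declaring argument/return types is the thin "strict typing" layer
# that the C side needs, but only at the boundary.
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int

print(libc.abs(-42))  # 42
```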
And writing plumbing to handle generic interoperability between two languages would be a lot of work. Writing parts of a project or system in different languages and having them communicate through a standard interface like an API is much less work.
Like interoperable in the sense that I could write a function in C and call it in Rust?
Yes, essentially. It's been a while since I've written Rust, but I'm pretty sure that (calling C from Rust) is already possible.
So imagine whatever Rust has with C but with many other languages.
I guess I'll say that minicomputer and mainframe implementations of languages like Pascal and FORTRAN and COBOL and even BASIC back in the 1970s were frequently built with interoperability in mind, like you bought all the compilers in one big bundle. (Maybe like gcc or the LLVM-based compiler suite today?) It might have been easier because most of these languages had minimal runtime systems, and the science of activation records and language implementation was mostly figured out by 1975, when Scheme came out with closures.
More modern languages have data structures that are similar but subtly different, and that's a challenge. For instance, the list types in Java and Python, as they're used in everyday software, are alike in a lot of ways but not the same. C has arrays built into the language but no expandable list in the standard library, so C developers write their own or pull one from a library. C++ has std::list, but as a linked list it has no fast way to get the n-th element (std::vector is the closer analogue). Common Lisp has its own idea of a "list", and Clojure lives on the JVM and can access a Java List just fine, but its own immutable list has a totally different API than those other languages' (e.g. no nconc!)
In general it is not so hard to access those objects through the foreign function interface or whatever your language has, but you either write stuff in the host language which is built especially to work with the guest language, or you make a wrapper, or you copy the data structures wholesale. It is never going to be trivial.
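As a tiny sketch of the "copy wholesale" option, here is ctypes (assuming CPython) copying a Python list into a fixed C int array; after the copy, the two are fully independent values:

```python
import ctypes

py_list = [10, 20, 30]
c_arr = (ctypes.c_int * len(py_list))(*py_list)  # copies values into a C int[3]

c_arr[0] = 99       # mutate the C-side copy
print(py_list)      # [10, 20, 30] -- the Python list is untouched
print(list(c_arr))  # [99, 20, 30]
```

The copy is simple precisely because it gives up what makes a Python list a Python list: the C side gets a fixed-size, fixed-type block, not a resizable heterogeneous sequence.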
Similarly, a language like Java has garbage collection, Python uses reference counting (plus a cycle collector), and Rust manages memory through ownership, with opt-in reference counting via Rc/Arc. Either way, you will either do your own memory management in the guest language or use some APIs to participate in the memory management of the host, but it is a hassle.
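A small sketch of that "participate in the host's memory management" arrangement from the Python side, again with ctypes (libc lookup assumes a Unix-like system): Python allocates and owns the buffer and must keep it alive; the C function only borrows a pointer and must never free it.

```python
import ctypes

libc = ctypes.CDLL(None)  # symbols of the running process; assumes Linux/macOS
libc.memset.argtypes = [ctypes.c_char_p, ctypes.c_int, ctypes.c_size_t]
libc.memset.restype = ctypes.c_void_p

buf = ctypes.create_string_buffer(8)  # Python allocates and owns this memory
libc.memset(buf, ord("x"), 7)         # C writes into it, but never frees it
print(buf.value)                      # b'xxxxxxx'
```

Going the other direction (C extension code holding Python objects) means manually following the host's refcounting rules, which is exactly the hassle described above.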
For a long time the standard architecture for complex systems has been a scripting language + a systems language. Like Lua and C or Python and C or maybe Clojure and Java.