
Switching between PL/SQL, Java and Shell Scripts

Calling a SQL script from another SQL script:
Suppose we have two script files, main.sql and sub.sql. To call sub.sql from main.sql, add this line to main.sql:

@sub.sql

The SQL code in sub.sql will run as if it were written directly in main.sql.

Calling a Java method from a shell script:
We can only call the main method of a class, by running it with the java command-line launcher.

#!/bin/ksh
# Here I have already compiled Test.java
echo "Running Test"
java Test
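
For completeness, a minimal Test class that the script above could run might look like this; the class body is an assumed illustration, not part of the original post:

// Test.java - hypothetical example class with a main method
public class Test {
    public static void main(String[] args) {
        // anything printed here appears in the shell that ran "java Test"
        System.out.println("Hello from Test");
    }
}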

Calling a shell script from a Java method:
You can run it using the exec method of the Runtime object. Sample code:

import java.util.*;
import java.io.*;

class Execx {
    public static void main(String args[]) {
        Runtime r = Runtime.getRuntime();
        try {
            r.exec("javaex.sh");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

But check this link also: http://www.javaworld.com/javaworld/jw-12-2000/jw-1229-traps.html
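
That article covers common Runtime.exec traps; in particular, a child process can hang if its output is never consumed. A hedged sketch using ProcessBuilder that drains the output and waits for the script might look like this (the class name ExecWithOutput is made up; javaex.sh is carried over from the example above):

import java.io.BufferedReader;
import java.io.InputStreamReader;

class ExecWithOutput {
    public static void main(String[] args) throws Exception {
        // merge stderr into stdout so a single reader drains both streams
        ProcessBuilder pb = new ProcessBuilder("sh", "javaex.sh");
        pb.redirectErrorStream(true);
        Process p = pb.start();

        // read the script's output so the process cannot block on a full pipe
        BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }

        int exitCode = p.waitFor();
        System.out.println("javaex.sh exited with code " + exitCode);
    }
}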

Calling a stored procedure from Java:

The following code puts the SQL statement into a string and assigns it to the variable createProcedure, which we will use later:

String createProcedure = "create procedure SHOW_SUPPLIERS " +
                         "as " +
                         "select SUPPLIERS.SUP_NAME, COFFEES.COF_NAME " +
                         "from SUPPLIERS, COFFEES " +
                         "where SUPPLIERS.SUP_ID = COFFEES.SUP_ID " +
                         "order by SUP_NAME";

The following code fragment uses the Connection object con to create a Statement object, which is used to send the SQL statement creating the stored procedure to the database:

Statement stmt = con.createStatement();
stmt.executeUpdate(createProcedure);


The procedure SHOW_SUPPLIERS will be compiled and stored in the database as a database object that can be called, similar to the way you would call a method.

JDBC allows you to call a database-stored procedure from an application written in the Java programming language. The first step is to create a CallableStatement object. As with Statement and PreparedStatement objects, this is done with an open Connection object. A CallableStatement object contains a call to a stored procedure; it does not contain the stored procedure itself. The first line of code below creates a call to the stored procedure SHOW_SUPPLIERS using the connection con. The part that is enclosed in curly braces is the escape syntax for stored procedures. When the driver encounters "{call SHOW_SUPPLIERS}", it will translate this escape syntax into the native SQL used by the database to call the stored procedure named SHOW_SUPPLIERS.

CallableStatement cs = con.prepareCall("{call SHOW_SUPPLIERS}");
ResultSet rs = cs.executeQuery();
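
Putting the fragments together, an end-to-end sketch might look like the following; the JDBC URL, credentials and class name are placeholders, while the procedure and column names come from the code above:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class ShowSuppliers {
    public static void main(String[] args) throws Exception {
        // placeholder JDBC URL and credentials - substitute your own database details
        try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/service", "user", "passwd");
             CallableStatement cs = con.prepareCall("{call SHOW_SUPPLIERS}");
             ResultSet rs = cs.executeQuery()) {
            while (rs.next()) {
                // column names come from the SELECT inside the stored procedure
                System.out.println(rs.getString("SUP_NAME") + " -> " + rs.getString("COF_NAME"));
            }
        }
    }
}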

Calling PL/SQL procedure from shell script
This is easy. In the following example, var1 is still available to us inside SQL*Plus, because the shell expands $var1 and $PROC_DATE in the here-document before the text reaches SQL*Plus. (The procedure name my_proc below is just a placeholder.)

#!/bin/bash
var1='test'
PROC_DATE="'2002-03-22'"
# my_proc is a placeholder procedure name
sqlplus user/passwd@service <<EOF
exec my_proc($PROC_DATE);
prompt $var1
EOF

Calling Shell Script from PL/SQL Procedure

This solution by Thomas Kyte is taken from a Google Groups post. It is a quick-and-dirty daemon.

Here is a PL/SQL subroutine you can install in your schema:

create or replace procedure host( cmd in varchar2 )
as
    status number;
begin
    dbms_pipe.pack_message( cmd );
    status := dbms_pipe.send_message( 'HOST_PIPE' );
    if ( status <> 0 ) then
        raise_application_error( -20001, 'Pipe error' );
    end if;
end;
/

Here is a shell script you can run in the background (make sure it is named host.csh, because the generated tmp.csh re-executes it under that name):

#!/bin/csh -f
sqlplus tkyte/tkyte <<"EOF" | grep '^#' | sed 's/^.//' > tmp.csh
set serveroutput on
declare
status number;
command varchar2(255);
begin
status := dbms_pipe.receive_message( 'HOST_PIPE' );
if ( status <> 0 ) then
dbms_output.put_line( '#exit' );
else
dbms_pipe.unpack_message( command );
dbms_output.put_line( '##!/bin/csh -f' );
dbms_output.put_line( '#' || command );
dbms_output.put_line( '#exec host.csh' );
end if;
end;
/
spool off
"EOF"

chmod +x tmp.csh
exec tmp.csh

If you run this script in the background, it will execute any host command you want. For example, run it in one window, then in another window go into SQL*Plus and try:

SQL> exec host( 'ls -l' );
SQL> exec host( 'uptime' );
SQL> exec host( 'echo Hello World' );
SQL> exec host( 'exit' );


You'll see the output of ls -l, uptime, and echo appear in the other window where the shell script is running, giving you real-time feedback from your PL/SQL procedure.
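
The same host procedure can also be invoked from Java through the CallableStatement escape syntax shown earlier; this is only an illustrative sketch, with a placeholder JDBC URL and class name:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class HostCall {
    public static void main(String[] args) throws Exception {
        // placeholder JDBC URL - point it at the database where the host procedure was created
        try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/service", "tkyte", "tkyte");
             CallableStatement cs = con.prepareCall("{call host(?)}")) {
            cs.setString(1, "ls -l");   // the command travels down HOST_PIPE to the waiting daemon
            cs.execute();
        }
    }
}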
