Why Oscript solutions are not passé

Over the years in my work with OpenText Livelink I have been fortunate to associate with like minds who, like me, share their code, goodwill and advice selflessly. I cherish and hold their friendship above all professional accolades I have ever received. Recently I had lunch with Ossie Moore (Ossie's LinkedIn); it was very short and I would have loved it to last longer, as we could talk shop endlessly. Most of what I write here has a bearing on that lunch, and he did steer this post. Ossie, for people who do not know him, is quite a character. Say you had a requirement: most of us would settle for pretty decent code in either Oscript or LAPI, as those were the predominant development methods one had at the time. Not Ossie. He would probably write our kind of code, and before releasing it he would have at least two better implementations; he would have thought about the user coming through the web, the other interfaces like LL Explorer, and so on. That is the difference.

Let’s look at ECM deployments in major organizations.

X Organization has identified a massive unstructured data problem. X has a new manager on the block who has had experience with a technology vendor; insert <vendorname> here. This is how most ECM deployments happen: a lot of people passionate about some technology who will shape the solution to suit the vendor. Most of them are assisted by Gartner, Forrester and those kinds of firms. I started working in OT technology before these rating companies came up with their barometer of wants, so I am going to continue with my cynicism.

So X Organization rolls out an OOB approach and people are brought in. Naturally, if the product is user friendly and easy, its use will grow; simple things like user activity and data growth are good ways to gauge its usefulness. Now most people will want to send links to objects, hoping the links won't break when the structure is reorganized or items are re-parented, which is what most document work revolves around. So you may want to invest in something people have put thought into, like a GUID-based system (Livelink, DCTM). Frankly, I know only Livelink, and DCTM very little. I hate SharePoint (we call it $carepoint) with a passion because the smallest of things are so difficult to get done. Good UI, and all the world has gone for it, but anything complex you have to buy or build with an array of .NET developers, different frameworks, the works.

It could also be that I have only been programming in Livelink and spending my time there, so over the years the idiosyncrasies of Builder and CSIDE no longer matter to me, because Oscript really has not changed very much. Oscript is a language of huge possibilities. One of my mentors, very famous in OT circles, John Simon, would joke like this: you have a problem you try solving in Oscript, you spend three to four days on it, and later you find out that what you need is already available. No need to kludge; it is just poor documentation, that is all.

So, if you are eager to use Livelink to its full extent, do these things:

  1. Challenge an OT person or an OT solution marketer on what they are offering you. Challenge them extremely hard if it is offered at $$$ a user or so; that is how they make their money on licenses.
  2. Challenge the pitch if it claims that Oscript is bad and spoils your chances of an upgrade. There is an element of truth in that: I have seen people hacking their way into core code, weblingo and other such no-nos, which shows inexperience and desperation. But Oscript is not just knowing how to create a request handler, a WebNodeAction or event scripting; it is all-round knowledge of the product and passion for the product. You can immediately tell whether a programmer knows anything, because if they start talking about new subtypes you may want to ask them what prompted them to create a subtype; many times those people have simply worked through the Oscript tutorials, and the "Hello World" of Oscript introduces them to that. You really want an Oscripter who will follow what OT advises; in that case it is very rarely a bad fit.
  3. Challenge whether the pitched solution ends up working only in the web GUI. What about EC? What about SOAP? What about REST? What about Search?

Oscript has years to go as long as the LL solution is still being used by customers, and it can be learned and made to work wonders for you. The other aspect, and full credit to Ossie here, is that LL is the most open product, much more open than open source: an Oscripter can see the entire internals of it, so people like me, John and Ossie have all found bugs, and we all engage the good OT developers on our finds. Most senior and passionate Oscripters continue to do so. There is so much openness and welcome in the OT programming community; people like David Templeton, Kyle Swidrowich and countless other good people have been striving to keep it open and to get communities to try it out for a better product. They have helped me and a lot of others, and you can also get in on the action.

You just need the drive and passion to master LL; that is all it takes. Well, that may be true for any technology 🙂


General Help Series 6 – Upgrading a Livelink System, the 9.7.1 Part

Recently I was tasked with upgrading our systems from their initial version, a two-version jump from 9.7.1 to 10.5. I will chronicle my experiences here and perhaps provide a structure. I also want to quell some myths about the whole process, just to make sure you can do this logically and correctly.

First, create a draft plan. In mine I included this: I would need at least one VM that could house our production 9.7.1 binaries, a clone of the database and a clone of the EFS. If one had been storing items in, say, Archive Server, one should have a playing copy of that as well. An OTAS-based Livelink upgrade can pose its own challenges if not done correctly, but it is no different from the EFS: if you do it wrong you can write test data into your productive store 🙂 This involves talking to other people like DBAs and storage folks, as a heavily used LL system can have tons and tons of data. You will most likely encounter the after-effects of long-time use, different administrators and their styles, and so on.

After a few failed attempts I understood why OT says a DB5 verification is very important. The verification basically runs single-threaded and tries to pinpoint anomalies in the database. Most likely it will pinpoint content loss, such as bulk imports without actual version files. In my case the database had wrong pointers, duplicate DataIDs (wow) and removed KUAF IDs (somebody had fun with an Oracle tool). So after going back and forth between OT and us, I wrote an Oscript program to check important structures. I list three, and your mileage will vary, but essentially I grabbed all categories, form templates and WF map structures. None of these checks happen in DB5; it just looks for existence. In one hilarious case I uncovered a category data structure that mysteriously turned up as a drawing PDF. One could argue that the users would have noticed, but this was an old category that nobody used. You don't have to do this, but each of those anomalies would prevent an upgrade and mean more back and forth with OT support, so I did it just to gain some time.
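The exact checks my Oscript program did are specific to our data, but the spirit is easy to reproduce with plain SQL against the database clone. Here is a minimal C# sketch, not the program itself: it flags duplicate DataIDs and orphaned parents and counts the structures I later inspected by hand. It assumes the stock DTree columns (DataID, ParentID, SubType) and a hypothetical ODBC DSN named LLCLONE pointing at the clone; run it read-only and never against production.

```csharp
// Hedged sketch: quick sanity checks against a *clone* of the Livelink schema.
// The "LLCLONE" DSN and credentials are placeholders; table/column names are the
// stock DTree ones (DataID, ParentID, SubType). Read-only, never against prod.
using System;
using System.Data.Odbc;

class UpgradeHealthCheck
{
    static void Main()
    {
        using (var con = new OdbcConnection("DSN=LLCLONE;UID=lldbuser;PWD=secret"))
        {
            con.Open();

            // 1. Duplicate DataIDs -- should never happen, but it did for us.
            Run(con, "Duplicate DataIDs",
                "SELECT DataID, COUNT(*) FROM DTree GROUP BY DataID HAVING COUNT(*) > 1");

            // 2. Orphaned rows -- ParentID points at a node that no longer exists.
            Run(con, "Orphaned parents",
                "SELECT DataID FROM DTree t WHERE t.ParentID > 0 AND NOT EXISTS " +
                "(SELECT 1 FROM DTree p WHERE p.DataID = t.ParentID)");

            // 3. Inventory of the structures I spot-checked by hand:
            //    131 = Category, 230 = Form Template, 128 = Workflow Map.
            Run(con, "Structures to spot-check",
                "SELECT SubType, COUNT(*) FROM DTree WHERE SubType IN (131, 230, 128) GROUP BY SubType");
        }
    }

    static void Run(OdbcConnection con, string title, string sql)
    {
        Console.WriteLine("== " + title);
        using (var cmd = new OdbcCommand(sql, con))
        using (var rdr = cmd.ExecuteReader())
            while (rdr.Read())
            {
                for (int i = 0; i < rdr.FieldCount; i++) Console.Write(rdr[i] + "\t");
                Console.WriteLine();
            }
    }
}
```

Anything these queries flag is exactly the kind of thing that stalls the upgrade later, so it is cheaper to find it now on the clone.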

The second thing was that I had to get hold of all the optional modules; our system is no different from any other, with a lot of optional modules, some inserting schema and some not.

The third thing was that our systems were customized with Oscript to provide an enhanced user experience, as well as awesome home-written modules that take the tedium out of long-running things like category upgrades and permission pushes, plus a very intelligent add-on to the OI module that people can use generically to upload their data. So our team sat down and thought about what we could retire and what we should re-code or refactor; many modules were thrown out, and many useful ones we kept. We particularly like the Distributed Agent framework; ours is very similar to it, and if the time comes and there is an appetite, we will re-code ours on top of it.

About the re-coding effort, here's the clunker: we were to use CSIDE. Naturally, all of my team are hard-core Builder aficionados, and boy, that was a journey. We were probably the first organization to do something serious with CSIDE, so naturally I gave OT a lot of screenshots and dumps, and OT dev was very receptive. Under extremely tight time constraints we developed parallel modules in 9.7.1 and 10 and checked them into our source code maintenance system, TFS. We also learned how to work in parallel on Oscript; not being a code-churning house, our code is not that complicated, so if I was working on one module somebody else was working on another.

So the environment prepared was:

  1. A 9.7.1 VM with the same Oracle client and the same type of database clone.
  2. I removed the production EFS and production Admin server information from the clone.
  3. We decided to rebuild the search index, as nobody had any clue whether it was good, and it predated the new search technology by eons.
  4. A base install of LL was done, and the working 9.7.1 binary copy from prod was put on top as an overlay. Many people starting out do not know this technique, where you get a copy of your prod binary, replete with patches et al, into a playing system. BTW, this is what you hear called the "Parallel Upgrade" approach. Its database connection was then redefined. In very olden times, when there was no differentiation between 32 and 64 bits and VMs were not popular or cheap, it was possible to use the existing boxes and run the upgrade in place. That was the "Update an Instance" method. It is largely vestigial at this time and very error prone, not to mention that you would have a real hard time going back to the working version 🙂
  5. All old 9.7.1 custom modules were removed, so this database had knowledge only of OT software, core and optional.
  6. I did the health check I mentioned earlier (categories, form templates and WF maps) and removed anything that would hamper an upgrade.

 

At this point one can take this to a CS 10.5 binary and start the upgrade. That is my next article.

 

BTW, I am not a novice when it comes to upgrades. I started using Livelink software in 1999 and have worked on versions as old as 8.1.5, so I pretty much know the heartbeat of an upgrade, and I take pleasure in challenges and in methodically removing them. Do not try an upgrade if you have not dry-run the procedure, without fail, a couple of times. In most cases the upgrade itself can be done in good time; it is the planning that can take weeks, if not months.

 

General Help Series 5 – Form Templates, Web Forms

Forms

Forms, as the name implies, are objects where you capture input and store it in Livelink. How far they take you depends on what you design your forms to be. I wrote this because a user was asking about it in the KB.

Main attributes: forms only work after installing the Forms module, which in turn gives you the Form Template subtype (230). This serves as the straw man for your data requirements.

Installing Web Forms gives you HTML form capability; likewise, installing PDF Forms gives you PDF form capability.

Forms are used extensively to deliver an awesome user experience. However, the generated form template code is very busy and inefficient, adding unwanted output (literally thousands of lines of JavaScript), so anything you can do to optimize the JavaScript is worth doing. In the course of your work you will create a Form Template and create HTML Views so that the user experience can be controlled via JavaScript logic; there are thousands and thousands of examples, including full forms, in the KB.

In an article I put up earlier, from when we were all active in the communities: https://appukili.wordpress.com/2013/08/22/important-free-developer-links/

I found an OT moderator quoting the exact lines I had written in a reply to a post:

(screenshot: appu_form)

Little has changed, although the product from Patrick Vitali called Answer Modules has outright taken away the tedium of developing in forms.

Every which way a form can be used: https://knowledge.opentext.com/knowledge/cs.dll?func=ll&objId=3501264&objAction=ArticleView&viewType=1

Almost all organizations will need the data in forms to be reported on, so use SQL tables as the storage. If you keep the default, the data goes in as version files, which are not parsable by a SQL tool, nor do they carry header information (you won't know which field is which). Users of the Transmittal product usually get bitten by this: most of the work in transmittals revolves around forms, and if you don't switch to SQL storage you will never be able to report on it. Actually, on my first project installing transmittals I bungled this, and then I looked under the covers to understand it. Even now there is no mention in any guide saying to change forms to SQL storage. A recent piece of help to an associate:

You can greatly improve a user's Livelink workflow experience by changing the process to use forms while maintaining the business logic of the workflow.

Advanced Topics in Forms/Workflows

I put this here, and it is of any use only if you know Oscript: Advanced Topics in Forms/Workflows

 

 

The Cobra Post

Moral Turpitude,

Rather than try to write her actual name, Devyani Khobragade, we Indians, naturalized or otherwise, seem to come up with easy-to-pronounce American or Western names. I chose my Indian nickname Appu, which is not my official name. So I have a Peter who is actually Pramvir, a Mike who is nowhere near a Mike, and so on. So let us dub her Cobra for shortness' sake.

The hullabaloo has gone viral, but here is what the Cobra case will finally entail:

  1. Because of extreme money and political standing it will look like the US has won, but they will make it look like both sides have won. I am reminded of how my father used to put it when we were kids. My father, Achan, enjoyed a particular card game popular in Kerala called 56. The intent of the game is to start a bidding war and try to win the bid. Assume my Achan's opposing team bid 44 in clubs and my Achan's team could not defeat them by securing 13 more points; standing near him and learning, I would say, "Dad, you lost," but he would correct me and say, "No, they won." Almost Zen-like. So that is what will happen here: the US wins and India wins.
  2. Who lost? The poor maid, whom the Indian diplomat screwed, literally. Also a bunch of Indian people who want to rock the boat, like the Aam Aadmi Party, but would like the VIP treatment to go on.
  3. Why is it just? Because US diplomats do not want to go to Indian jails for breaking arcane laws. We don't even have good toilets in lockups and prisons, let alone good ones for non-criminals. We may do cavity searches, but we may not have gloves, or they could be used ones.

So Cobra will get off scot-free, and it will really be point #3 of my list that opens the eyes of the US diplomats.

There isn't a good proverb in English for this, but the closest I can put here:

Here is the real one in Malayalam: "Aarante Ammakku Pranthu Vannal Kaanam Nalla Rasam"

Which is roughly like "kick the can down the road". The literal translation is basically that it is rather fun to watch if someone's mother, not my mother, has gone postal.

 

 

 

How to use WCF Livelink web services to create a quasi single sign-on, or a login-with-cookie equivalent of LAPI

Recently OT dev said not to spend time on the Search SOAP service but to use the RESTful search API that has been there all along these years.

Since almost everything in Livelink is served off a URL, one of the first things it needs is authentication. For Livelink deployments that do not have single sign-on based on web server auth, it is inconvenient when a web app designed in C# provides links to a Livelink URL. For example, I may have an href called "My Assignments" and link it to my Livelink server URL with a query string ending in personal.assignments.

I really don't think organizations using Livelink still use userid/password, but this will probably help those who do.

http://mylivelinkurl/livelinkvd/llisapi.dll?func=personal.assignments

Now everybody reading this should know that Livelink will look for a cookie, and if there is none it will present a login screen. Our attempt is to use the token (cookie) returned to us by the CWS auth routine and pass it along to a Livelink URL, making the request to Livelink as if one had logged in, and then performing operations.

In my example I am doing this against an LL 9.7.1 version, so the ref keyword is not used. For newer CWS the ref keyword is needed.

SETUP

LL 9.7.1, Oracle, IIS 6; the web server is anonymous, the Livelink auth scheme is Livelink, no RCS is present, and there is no Directory Services module in the deployment.

In SSO deployments, calling a Livelink URL from an authenticated user's computer results in a pass-through experience, so none of this circus is needed anyway.

What I found was that if you add the LLCookie to the request yourself, you have to do a lot of coding, as the user in this thread did. I found several hits on the web about spoofing the cookie, but it is a lot of code for something that you know is not that secure anyway.

RE RE RE RE RE Get Region Name

He first gets the auth token and uses cookie-setting code to call the search API.
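In C# the same idea looks roughly like the sketch below. This is not the thread's code: AuthenticationClient is whatever proxy svcutil generated from your Authentication WSDL, the host and credentials are made up, and I am assuming the CWS token can be handed straight to Livelink as the LLCookie value, which is the premise of this post.

```csharp
// Hedged sketch of the cookie approach: get a CWS token, present it as LLCookie.
// AuthenticationClient is the generated WCF proxy; host and credentials are placeholders.
using System;
using System.IO;
using System.Net;

class CookieLogin
{
    static void Main()
    {
        // 1. Authenticate through CWS (9.7.1 era, so no ref OTAuthentication yet).
        var auth = new AuthenticationClient();
        string token = auth.AuthenticateUser("livelinkuser", "livelinkpassword");

        // 2. Present the token as the LLCookie on a plain Livelink URL.
        //    Escaping because the raw token can contain characters that are not cookie-safe.
        string url = "http://llappu971vm:8080/livelink/livelink.exe?func=ll&objId=670175&objAction=browse";
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.CookieContainer = new CookieContainer();
        request.CookieContainer.Add(new Cookie("LLCookie", Uri.EscapeDataString(token), "/", "llappu971vm"));

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            // If the cookie was accepted this is the browse page, not the login screen.
            string body = reader.ReadToEnd();
            Console.WriteLine("Received " + body.Length + " bytes");
        }
    }
}
```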

While that is all well and good, if you have access to a web debugging tool like Fiddler and you capture traffic for that first auth call, you can see your userid and password if it is an HTTP connection. I am not sure what it would look like if my Livelink were HTTPS. So I would just build a userid/password URL and specify everything in nextURL.

So if I were coding a C# app and wanted to call my search system in Livelink, I would just use the simple approach:

http://llappu971vm:8080/livelink/livelink.exe?func=ll.login&username=livelinkuser&password=livelinkpassword&nextURL=%2Flivelink%2Flivelink.exe%3Ffunc=ll%26objId=670175%26objAction=browse%26viewType=1

In the above URL, func=ll.login sets the cookie and then the nextURL is followed; it is just web-escaped for transmission.
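If you are composing that link from C#, the only piece that usually trips people up is the escaping of nextURL. A small sketch using the same made-up host, credentials and object ID as the URL above:

```csharp
// Minimal helper that builds the ll.login URL with an escaped nextURL.
// Host, credentials and objId are the placeholder values from the example above.
using System;

class LoginUrlBuilder
{
    static void Main()
    {
        string baseUrl = "http://llappu971vm:8080/livelink/livelink.exe";
        string nextUrl = "/livelink/livelink.exe?func=ll&objId=670175&objAction=browse&viewType=1";

        // func=ll.login sets the cookie, then Livelink redirects to nextURL.
        string loginUrl = baseUrl
            + "?func=ll.login"
            + "&username=" + Uri.EscapeDataString("livelinkuser")
            + "&password=" + Uri.EscapeDataString("livelinkpassword")
            + "&nextURL=" + Uri.EscapeDataString(nextUrl);

        Console.WriteLine(loginUrl);
        // Hand loginUrl to the browser (or an href); remember the credentials
        // travel in clear text unless the site is HTTPS.
    }
}
```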

Link to Creating RESTful web services in livelink

This is a very good article, very insightful for people like me who don't really understand the hyped-up words in use.

http://ednortonengineeringsociety.blogspot.com/2009/08/creating-restful-web-services-with.html

People at OT have said that their plan is to support web services, Oscript and REST.

So I am going to try and see if I can make some good, easy-to-use things. It looks easier than CWS to me.

I know my Oscript, so the rest is basically some crafted query strings. Is that what REST is?

BTW, this is free publicity for the original blogger. If I should not be posting links, let me know and I will remove them.

Single Sign On and Content Web Services Update

Update 04/28/2016

Prerequisites: a Livelink 10/10.5/16 server configured for single sign-on using IWA. This is for users/companies who allow REMOTE_USER to trickle into Livelink. If you have configured your Livelink to use OTDS, or simply put, if OTDS is actually responsible for providing the authentication, then this post does not address that. In CWS that involves an OTDS server you source your authentication from OTDS (you read a different WSDL than the Livelink authentication WSDL). The theory there is that your WSAPI computer or client passes your Kerberos authentication, OTDS vets it with AD/DC and passes back a ticket, and this OTDS ticket is then usable in your code.

 

Sample Setup on the WSAPI WEBSERVER

  1. Copy the web services from OpenText to a folder on the IIS server. For ease we assume your Livelink server is on the same host and knows how to handle REMOTE_USER. If you want to run your Livelink on another server, edit the Web.config to give it the real server name and port. I doubt my setup can span computers; I never tested it that way.
  2. Configure an application that looks like the screen capture (screenshot: sso setup).
  3. Make sure it has IWA turned on (screenshot: sso1), and make sure Extended Protection is OFF.
  4. If not already done, create a logs directory at the same level as \bin and manage its permissions correctly (screenshot: windows acl). This can also be done by doing a setspn, which is really an advanced topic. The intent is that when the client code hits this WSDL, the application pool identity will transfer your REMOTE_USER and Windows will do all that mumbo jumbo.
  5. Edit the Web.config carefully after making a copy. I am showing the changes, highlighted, for Authentication, DocumentManagement and ContentService (screenshot: Web.Config). The file is rife with possibilities, so start with these three and mature into SSL and the others.
  6. Give the WSDL a whirl. If it can't create a service or show the green lines, it is not a fault of OT; this is all Windows WCF configuration, so check Microsoft sites or Google for your errors. One thing I check: if all is successful, the logs directory will show entries (screenshot: windows logs).
  7. Once the WSDL is put into a real source code file, if you get errors in your client, the log files will indicate the errors coming from Livelink. Note: if you were using code from an anonymous Livelink/WSAPI setup, you need to rebuild the source after updating the WSDLs (screenshot: code).
  8. The above was tested on CS10 with the Oscript Directory Services module and patches for SSO. I wrote a client which uses Windows authentication and gave it to our app dev team; a minimal sketch of such a client follows below.

If you have the chance to code in this, you will be very pleased.
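For completeness, here is a rough sketch of the kind of client mentioned in step 8. It is not the exact code I handed over: AuthenticationClient, DocumentManagementClient and OTAuthentication are whatever svcutil generated from the endpoints in the Web.config above, the node ID 2000 is just the conventional Enterprise Workspace ID, and I am showing the newer ref OTAuthentication calling convention. In an IWA setup the Windows binding carries the caller's logged-on identity, so adjust the token-acquisition step to whatever your configuration dictates.

```csharp
// Hedged sketch of a CWS client. Proxy classes come from svcutil against the
// WSDLs configured above; names and signatures may differ in your generated code.
using System;
using System.Net;

class CwsClient
{
    static void Main()
    {
        // With the IWA-enabled endpoint the Windows binding presents the logged-on
        // identity; DefaultNetworkCredentials just makes that explicit.
        var auth = new AuthenticationClient();
        auth.ClientCredentials.Windows.ClientCredential = CredentialCache.DefaultNetworkCredentials;

        // Assumption: token acquisition shown with the stock AuthenticateUser call;
        // replace this step with whatever your SSO configuration requires.
        string token = auth.AuthenticateUser("livelinkuser", "livelinkpassword");

        // Every subsequent call carries the token in the OTAuthentication header;
        // newer CWS proxies take it as a ref parameter (see the earlier note on ref).
        var otAuth = new OTAuthentication { AuthenticationToken = token };

        var docMan = new DocumentManagementClient();
        docMan.ClientCredentials.Windows.ClientCredential = CredentialCache.DefaultNetworkCredentials;

        var node = docMan.GetNode(ref otAuth, 2000); // 2000 = Enterprise Workspace
        Console.WriteLine("Fetched: " + node.Name);
    }
}
```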

LAPI: What is it and should I use it?

Update:06/12/2017

I realized that OT is still supporting old LAPI for their product called Enterprise Scan. EScan used to be completely a client app, so if the Livelink server it connects to is CS16, OT asks you to install a module. Surreptitiously they call it the "Private Bridge Interface", which is all the LAPI Oscript code conveniently bundled into its own OSpace. My guess is this will continue until EScan, AGA and the other OT breadwinners convert to SOAP or REST.

 

Update: 12/29/2015. OT has announced that with the launch of CS16 there will be no server-side Oscript to receive LAPI calls, so don't waste your time reading this or programming in it. I am keeping this for vestigial reasons only.

At the time of this writing, circa February 2013, Open Text's Livelink has been on the market for about 18 years with a fairly loyal fan base. The core software was always enhanced and added on to with Oscript, but for programmers who wished to write utilities, such as adding users en masse or deleting things en masse, there was LAPI, which stood for Livelink Application Programming Interface. In its inception, from what I saw around 2000, you could program in Java (which I did), in VB6, and then in C and C++. Vended products such as Livelink Explorer, Authorlink and the AutoCAD integration were built around it. In a nutshell, these were the steps one used to take:

  1. Install or copy the LAPI client files onto your computer.
  2. Write a program
  3. Compile it and run.

At runtime, the first call, which everybody knew as session creation, was akin to telnetting to the Livelink server on port 2099 (the default Livelink server listening port). As time went by, OT added a lot of calls on the LAPI server side, so people built a lot of cool things. But there were problems, in that as modern OOP languages came along they abhorred the idea that Livelink was handing them data structures (LLValue objects) and making them work through those. So OT did a small stint where they loaded the LAPI libraries on top of a web server and called it Live Services, but at that point one might as well write a web service call. I think it died, like OT's foray into JavaObjects to replace Oscript or code side by side with it; I never understood what one would use JavaObjects for. Note that I am not talking about the Oscript-to-Java bridge, which I have messed with and which is very useful.
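For anyone who never saw the client side, a session looked roughly like the sketch below. This is a minimal C# example against the .NET LAPI assembly; the server name, port and credentials are made up, and AccessEnterpriseWS is the same call you see being received in Oscript in the screenshot further down.

```csharp
// Hedged sketch of a classic LAPI client session (the .NET port of com.opentext.api).
// Server, port and credentials are placeholders.
using System;
using com.opentext.api;

class LapiSample
{
    static void Main()
    {
        // Session creation: effectively a socket to the Livelink server on port 2099.
        LLSession session = new LLSession("llappu971vm", 2099, "", "Admin", "password", null);

        LAPI_DOCUMENTS docs = new LAPI_DOCUMENTS(session);
        LLValue entInfo = (new LLValue()).setAssocNotSet();

        // AccessEnterpriseWS returns the Enterprise Workspace object and volume IDs.
        if (docs.AccessEnterpriseWS(entInfo) != 0)
        {
            Console.WriteLine("LAPI error " + session.getStatus() + ": " + session.getStatusMessage());
            return;
        }

        Console.WriteLine("Enterprise Workspace ID: " + entInfo.toInteger("ID")
            + ", VolumeID: " + entInfo.toInteger("VolumeID"));
    }
}
```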

With the release of 9.7.1 they bundled Livelink web services as part of the Livelink install, the idea being that a company would just use its existing IIS or Tomcat investment and start creating a plumbing conduit into Livelink. At CS10 they stopped making an installer for LAPI. I am told this is because they do not have the development resources to make libraries in several different languages.

The new web services are strongly typed; you never need to guess what you ask Livelink for and what you get in return. But while that is easy for a programmer to say and do, the majority of programmers called or hired to do something against Livelink have no clue how Livelink works. That is where beautifully written, working, hand-holding programs are needed. A programmer also needs to understand what he is trying to do, which may be as simple as logging into Livelink and observing the different things. Jason Smith, Scott Grasley and Kyle Swidrowich are all great OT programmers who are putting such structure into the new web services paradigm. Whenever I find time and need some challenging stuff I also put up some samples, but most of my time nowadays is spent in Oscript.

Coming back to LAPI: the server side remains and will be there for as far as I can see. If OT basically puts their foot down and removes the API on the server, then your program will not work. But then again, it is probably so widespread that you have no cause for concern. Not having an installer should be less worrying, as any old edition of the API will work in Java or a .NET language. That is what I do: I have a new 64-bit computer, I just copy the DLLs from a 32-bit machine and work on the .NET 2.0 framework. I also lost touch with Java somewhere along the way; I guess I will have to brush up, because OT is going to get us all programming in Eclipse 🙂

So, in conclusion, it really is your preference how you code and how well you protect your investment.

Here is a LAPI call received in the heart of Livelink, in Oscript. If the handler for the AccessEnterprise call, to take one sample, were not there, your investment would suffer:

(screenshot: apicall)

Going by the same token, here is a web services call received in Livelink:

(screenshot: webservicecall)

As you can see, when you program a client application your call needs to come into Livelink; web services use the web server to get into the Livelink code.

At this point you are able to create a LAPI code page and call the web services code, or write a web services code piece and call LAPI. Increasingly, over time, the LAPI server side will cease to be and no effort will go into its upkeep. In older versions of the web services they used the LAPI dispatcher to send in the web services command; now I don't think there is a dependency on that.

Chris Morley's comment, from a person a little more computer-trained than me:

https://knowledge.opentext.com/knowledge/cs.dll?func=ll&objId=33077212&objAction=viewincontainer

Almost a diatribe from an angry user; it is fun.